NEvoFed: A Decentralized Approach to Federated NeuroEvolution of Heterogeneous Neural Networks

Custode, Leonardo Lucio; Iacca, Giovanni;
2024-01-01

Abstract

In the past few years, Federated Learning (FL) has emerged as an effective approach for training neural networks (NNs) over a computing network while preserving data privacy. Most of the existing FL approaches require the user to define a priori the same structure for all the NNs running on the clients, along with an explicit aggregation procedure. This can be a limiting factor in cases where pre-defining such algorithmic details is difficult. To overcome these issues, we propose a novel approach to FL, which leverages Neuroevolution running on the clients. This implies that the NN structures may be different across clients, hence providing better adaptation to the local data. Furthermore, in our approach, the aggregation is implicitly accomplished on the client side by exploiting the information about the models used on the other clients, thus allowing the emergence of optimal NN architectures without needing an explicit aggregation. We test our approach on three datasets, showing that very compact NNs can be obtained without significant drops in performance compared to canonical FL. Moreover, we show that such compact structures allow for a step towards explainability, which is highly desirable in domains such as digital health, from which the tested datasets come.
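The abstract describes clients that evolve their own NN structures locally and aggregate implicitly by exploiting information about the models used on the other clients. Below is a minimal, self-contained Python sketch of that general idea; it is not the paper's implementation, and all names (Client, evolve_locally, the integer "hidden units" stand-in for a network structure, and the toy fitness function) are illustrative assumptions only.

```python
# Conceptual sketch (NOT the authors' algorithm): each client runs its own
# neuroevolution loop and "aggregates" implicitly by injecting the peers'
# best model descriptors into its local population, instead of averaging
# weights on a server. A model is reduced to a single integer (hidden units)
# purely for illustration.
import random

random.seed(0)


def fitness(hidden_units, local_bias):
    # Toy surrogate for local validation accuracy: each client prefers a
    # different model size (heterogeneous local data), and larger networks
    # are mildly penalized to favor compact structures.
    return -abs(hidden_units - local_bias) - 0.05 * hidden_units


class Client:
    def __init__(self, local_bias, pop_size=10):
        self.local_bias = local_bias
        self.population = [random.randint(1, 64) for _ in range(pop_size)]

    def evolve_locally(self, peer_models):
        # Implicit, client-side aggregation: peers' best descriptors compete
        # with the local population under the *local* fitness.
        candidates = self.population + peer_models
        ranked = sorted(candidates, key=lambda h: fitness(h, self.local_bias), reverse=True)
        parents = ranked[: len(self.population) // 2]
        # Mutate parents to refill the population; structures may drift
        # differently on each client.
        children = [max(1, p + random.choice([-2, -1, 1, 2])) for p in parents]
        self.population = parents + children

    def best(self):
        return max(self.population, key=lambda h: fitness(h, self.local_bias))


clients = [Client(local_bias=b) for b in (8, 12, 20)]
for _ in range(5):                               # federated rounds
    shared = [c.best() for c in clients]         # lightweight model descriptors
    for c in clients:
        c.evolve_locally(shared)                 # no server-side FedAvg step

print([c.best() for c in clients])
```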
Year: 2024
Published in: GECCO '24: Proceedings of the Genetic and Evolutionary Computation Conference
Place: New York, NY, USA
Publisher: ACM
ISBN: 979-8-4007-0494-9
Authors: Custode, Leonardo Lucio; De Falco, Ivanoe; Della Cioppa, Antonio; Iacca, Giovanni; Scafuri, Umberto
Citation: NEvoFed: A Decentralized Approach to Federated NeuroEvolution of Heterogeneous Neural Networks / Custode, Leonardo Lucio; De Falco, Ivanoe; Della Cioppa, Antonio; Iacca, Giovanni; Scafuri, Umberto. - (2024), pp. 295-303. (Paper presented at the GECCO '24 conference held in Melbourne, Australia, 14th-18th July 2024) [10.1145/3638529.3654029].
Files in this item:

GECCO2024_NEAT_FL.pdf
Access: Open access
Type: Refereed author's manuscript (post-print)
License: All rights reserved
Size: 224.79 kB
Format: Adobe PDF

3638529.3654029.pdf
Access: Open access
Type: Publisher's layout (editorial version)
License: Creative Commons
Size: 247.3 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/418991
Citations
  • PMC: not available
  • Scopus: 0
  • ISI (Web of Science): not available
  • OpenAlex: 0