A Genetic Algorithm-Based Parameter Selection for Communication-Efficient Federated Learning / Hassan, Mir; Yildirim, Kasim Sinan; Iacca, Giovanni. - 15613:(2025), pp. 437-453. (28th European Conference on Applications of Evolutionary Computation, EvoApplications 2025, held as part of EvoStar 2025, Trieste, 23-25 April 2025) [10.1007/978-3-031-90065-5_27].

A Genetic Algorithm-Based Parameter Selection for Communication-Efficient Federated Learning

Mir Hassan; Kasim Sinan Yildirim; Giovanni Iacca
2025-01-01

Abstract

Federated Learning (FL) enables decentralized model training without centralized data collection, but high communication overhead remains a key challenge, particularly in bandwidth-constrained environments such as IoT and edge networks. Existing FL methods transmit full model updates, which leads to high communication costs. In this paper, we propose Genetic Algorithm-based Selective Parameter Updates (GASPU), a novel approach that uses a Genetic Algorithm (GA) to selectively transmit model parameter updates, significantly reducing communication overhead while maintaining competitive accuracy. GASPU optimizes binary masks so that only the most effective parameters are sent. We validate GASPU on the HAR and KWS datasets, which are representative of realistic FL settings. While achieving a 66% reduction in communication overhead over 100 communication rounds on HAR (from 0.49 MB to 0.16 MB) and a 65% reduction on KWS (from 1.27 MB to 0.44 MB), GASPU maintained competitive accuracy with only a 10% drop. Existing methods achieve higher accuracies (above 80%), but at significantly higher communication costs. Further experiments on the MNIST benchmark dataset confirm GASPU's generalizability, with only a 0.25% drop in accuracy and a 52% reduction in communication overhead over 10 communication rounds.
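The core idea in the abstract — evolving a binary mask that decides which parameter updates a client transmits — can be illustrated with a minimal sketch. This is not the paper's actual GASPU algorithm: the fitness function, operators, and all constants below (`N_BLOCKS`, `UTIL`, `LAMBDA`, truncation selection) are illustrative assumptions standing in for the accuracy/communication trade-off the paper optimizes.

```python
# Illustrative sketch (NOT the paper's GASPU implementation): a simple GA
# that evolves a binary mask over parameter blocks, rewarding an assumed
# per-block "utility" while penalizing the communication cost of sending.
import random

random.seed(0)

N_BLOCKS = 20                                       # hypothetical number of parameter blocks
COST = [1.0] * N_BLOCKS                             # uniform per-block transmission cost
UTIL = [random.random() for _ in range(N_BLOCKS)]   # stand-in for accuracy gain per block
LAMBDA = 0.5                                        # assumed cost-penalty weight

def fitness(mask):
    """Reward the utility of selected blocks, penalize their total cost."""
    gain = sum(u for u, m in zip(UTIL, mask) if m)
    cost = sum(c for c, m in zip(COST, mask) if m)
    return gain - LAMBDA * cost

def crossover(a, b):
    """Uniform crossover of two binary masks."""
    return [x if random.random() < 0.5 else y for x, y in zip(a, b)]

def mutate(mask, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return [1 - m if random.random() < rate else m for m in mask]

def evolve(pop_size=30, generations=50):
    pop = [[random.randint(0, 1) for _ in range(N_BLOCKS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                # truncation selection keeps the best half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(f"blocks transmitted: {sum(best)}/{N_BLOCKS}")
```

In a federated setting, the evolved mask would be applied to the update vector before upload, so only the selected blocks travel over the network; the real method additionally evaluates masks against model accuracy rather than a synthetic utility score.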
2025
Applications of Evolutionary Computation (EvoApplications 2025)
Cham
Springer Science and Business Media Deutschland GmbH
ISBN: 9783031900648; 9783031900655
Files in this product:

paper_54.pdf
Embargo until 17/04/2026
Description: Manuscript
Type: Refereed author's manuscript (post-print)
License: All rights reserved
Size: 917.87 kB
Format: Adobe PDF
978-3-031-90065-5_27.pdf
Access: archive administrators only
Type: Publisher's layout (editorial version)
License: All rights reserved
Size: 806.02 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/452530
Citations
  • Scopus: 0
  • OpenAlex: 0