Revisiting Ensembling in One-Shot Federated Learning
Dec 10, 2024
Youssef Allouah
Akash Dhasade
Rachid Guerraoui
Nirupam Gupta
Anne-Marie Kermarrec
Rafael Pinot
Rafael Pires
Rishi Sharma
Abstract
Federated Learning (FL) is an appealing approach to training machine learning models without sharing raw data. However, standard FL algorithms are iterative and thus incur a significant communication cost. One-Shot FL (OFL) replaces the iterative exchange of models between clients and the server with a single round of communication, thereby saving substantially on communication costs. Unsurprisingly, OFL exhibits an accuracy gap with respect to FL, especially under high data heterogeneity. We introduce Fens, a novel federated ensembling scheme that approaches the accuracy of FL with the communication efficiency of OFL. Learning in Fens proceeds in two phases: first, clients train models locally and send them to the server, as in OFL; second, clients collaboratively train a lightweight prediction aggregator model using FL. We showcase the effectiveness of Fens through extensive experiments spanning several datasets and heterogeneity levels. On the heterogeneously distributed CIFAR-10 dataset, Fens achieves up to 26.9% higher accuracy than state-of-the-art OFL, while remaining only 3.1% below FL. At the same time, Fens incurs at most 4.3x more communication than OFL, whereas FL is at least 10.9x more communication-intensive than Fens.
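The two-phase scheme described in the abstract can be sketched in a few lines. The following is a minimal PyTorch illustration of the general idea, not the paper's implementation: the toy linear models, synthetic client shards, hyperparameters, and the plain FedAvg update for the aggregator phase are all assumptions made for exposition.

```python
# Minimal sketch of a two-phase federated ensembling scheme in the spirit of
# Fens. All sizes, helpers, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
NUM_CLIENTS, NUM_CLASSES, DIM = 4, 3, 10

# Synthetic per-client datasets (stand-ins for real heterogeneous shards).
def make_shard():
    x = torch.randn(64, DIM)
    y = torch.randint(0, NUM_CLASSES, (64,))
    return x, y

shards = [make_shard() for _ in range(NUM_CLIENTS)]

def train(model, x, y, epochs=20, lr=0.1):
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        nn.functional.cross_entropy(model(x), y).backward()
        opt.step()
    return model

# Phase 1 (one-shot): each client trains locally and uploads its model once.
local_models = [train(nn.Linear(DIM, NUM_CLASSES), x, y) for x, y in shards]
for m in local_models:
    m.requires_grad_(False)  # frozen after the single upload

# The aggregator consumes the concatenated logits of all local models.
def stacked_logits(x):
    return torch.cat([m(x) for m in local_models], dim=1)

# Phase 2 (iterative FL): clients jointly train a lightweight aggregator
# over the frozen local models' predictions; here, plain FedAvg rounds.
global_agg = nn.Linear(NUM_CLIENTS * NUM_CLASSES, NUM_CLASSES)
for _ in range(5):  # a few cheap FL rounds on a small model
    states = []
    for x, y in shards:
        local_agg = nn.Linear(NUM_CLIENTS * NUM_CLASSES, NUM_CLASSES)
        local_agg.load_state_dict(global_agg.state_dict())
        train(local_agg, stacked_logits(x), y, epochs=5)
        states.append(local_agg.state_dict())
    # FedAvg: parameter-wise mean of the client aggregators.
    global_agg.load_state_dict({
        k: torch.stack([s[k] for s in states]).mean(0) for k in states[0]
    })
```

Communication in this sketch mirrors the abstract's cost argument: phase 1 sends each full local model exactly once, while the iterative phase 2 exchanges only the small aggregator, which is why the overall cost stays close to OFL.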
Publication
38th Annual Conference on Neural Information Processing Systems (NeurIPS)