We are pleased to share that SUTD Faculty Fellow Marie Siew and her collaborators have won the Best Poster Award at the 44th IEEE International Conference on Distributed Computing Systems (ICDCS) 2024 for their project “Optimal Variance-Reduced Client Sampling for Multiple Model Federated Learning”.
Federated learning (FL) is a variant of distributed learning in which multiple clients collaborate to learn a global model without sharing their data with the central server. In real-world scenarios, a client may be involved in training multiple unrelated FL models, a setting the authors call multi-model federated learning (MMFL), where the client sampling strategy and task allocation are crucial to system performance. The paper proposes an optimal sampling method that minimizes the variance of the global updates while keeping the learning unbiased in MMFL systems. In simulations on real-world federated datasets, the resulting method achieves average accuracy over 30% higher than baseline methods.
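As a rough illustration of the idea (a minimal sketch, not the paper's exact algorithm), the snippet below shows how a server might assign (client, model) training slots by importance sampling: pairs are drawn with probability proportional to hypothetical per-client gradient norms, and each sampled update is reweighted by the inverse of its sampling probability so the aggregated update stays unbiased while its variance is reduced. The `grad_norms` values and `clients_per_round` budget are illustrative assumptions.

```python
import numpy as np

# Hypothetical per-client gradient norms for two FL tasks
# (rows: clients, columns: models). In practice these would be
# estimated from recent local updates.
grad_norms = np.array([
    [1.2, 0.4],
    [0.3, 0.9],
    [2.1, 0.2],
    [0.5, 0.5],
])

num_clients, num_models = grad_norms.shape
clients_per_round = 2  # assumed server budget per round

# Variance-reduction intuition: sample (client, model) pairs with
# probability proportional to their gradient norms.
probs = (grad_norms / grad_norms.sum()).ravel()

# Sample with replacement so the inverse-probability weighting below
# gives an unbiased estimate of the full aggregated update.
chosen = np.random.choice(probs.size, size=clients_per_round, replace=True, p=probs)

for idx in chosen:
    client, model = divmod(idx, num_models)
    weight = 1.0 / (probs[idx] * clients_per_round)  # importance weight
    print(f"client {client} trains model {model}, aggregation weight {weight:.2f}")
```

Weighting each sampled update by the inverse of its sampling probability keeps the expected aggregated update equal to the full-participation update, and concentrating the sampling mass on large-gradient clients is what drives the variance reduction.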
Haoran Zhang, Zekai Li, Zejun Gong, Marie Siew, Carlee Joe-Wong, Rachid El-Azouzi, “Optimal Variance-Reduced Client Sampling for Multiple Model Federated Learning”, in Proc. IEEE International Conference on Distributed Computing Systems (ICDCS Posters), Jersey City, July 2024.
Paper link: https://haoran-zh.github.io/Publications/ICDCS_Poster_MMFL_optimal_sampling__gradient_norm_.pdf
About IEEE ICDCS:
The 44th IEEE International Conference on Distributed Computing Systems (ICDCS) is a premier international forum for researchers and practitioners to present, discuss, and exchange cutting-edge ideas and the latest findings on all aspects of distributed computing systems.