FedGK: Communication-Efficient Federated Learning through Group-Guided Knowledge Distillation

  • Zhang W
  • Liu X
  • Tarkoma S
Citations: 1 · Readers: 8 (Mendeley)

Abstract

Federated learning (FL) empowers a cohort of participating devices to contribute collaboratively to a global neural network model, ensuring that their training data remains private and stored locally. Despite its advantages in computational efficiency and privacy preservation, FL grapples with the challenge of non-IID (not independent and identically distributed) data from diverse clients, leading to discrepancies between local and global models and potential performance degradation. In this paper, we propose FedGK, an innovative communication-efficient Group-Guided FL framework designed for heterogeneous data distributions. FedGK employs a localized-guided framework that enables each client to effectively assimilate key knowledge from teachers and peers while minimizing extraneous peer information in FL scenarios. We conduct an in-depth analysis of the dynamic similarities among clients over successive communication rounds and develop a novel clustering approach that accurately groups clients with diverse heterogeneities. We implement FedGK on public datasets with an innovative data transformation pattern called “cluster-shift non-IID”, which mirrors data distributions more prevalent in real-world settings, where clients can be grouped into clusters with similar data distributions. Extensive experimental results on public datasets demonstrate that FedGK improves accuracy by up to 32.89% and saves up to 53.33% in communication cost over state-of-the-art baselines.
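The paper's implementation is not reproduced on this page. As a minimal illustrative sketch only — the greedy cosine-similarity grouping, the loss form, and all function names and hyperparameters below are assumptions for exposition, not the authors' published method — similarity-based client grouping and a group-guided distillation loss could look like:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def group_clients(updates, threshold=0.8):
    """Greedy clustering stand-in: assign each client's update vector to the
    first group whose representative it resembles (cosine similarity above a
    threshold), otherwise open a new group. The paper's actual clustering over
    dynamic cross-round similarities would replace this."""
    groups, reps = [], []
    for cid, u in enumerate(updates):
        for g, r in zip(groups, reps):
            if cosine(u, r) >= threshold:
                g.append(cid)
                break
        else:
            groups.append([cid])
            reps.append(u)
    return groups

def group_guided_kd_loss(student_logits, teacher_logits, peer_logits, labels,
                         alpha=0.5, beta=0.3, T=2.0):
    """Assumed loss shape: cross-entropy on local labels, plus KL distillation
    toward the global teacher and toward the averaged soft labels of
    same-group peers (weighted alpha and beta respectively)."""
    n = student_logits.shape[0]
    p_s = softmax(student_logits)
    ce = -np.log(p_s[np.arange(n), labels] + 1e-12).mean()

    def kl(p, q):  # mean KL(p || q) over the batch
        return (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1).mean()

    q_s = softmax(student_logits, T)
    p_teacher = softmax(teacher_logits, T)
    p_peer = softmax(np.mean(peer_logits, axis=0), T)  # average peer logits
    return (1 - alpha - beta) * ce + alpha * kl(p_teacher, q_s) + beta * kl(p_peer, q_s)
```

Restricting the peer term to same-group clients is what would limit "extraneous peer information": clients with dissimilar data land in other groups and never contribute soft labels.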


Cited by

ADAMT: Adaptive distributed multi-task learning for efficient image recognition in Mobile Ad-hoc Networks


Citation (APA)

Zhang, W., Liu, X., & Tarkoma, S. (2024). FedGK: Communication-Efficient Federated Learning through Group-Guided Knowledge Distillation. ACM Transactions on Internet Technology. https://doi.org/10.1145/3674973

Readers' Seniority: Professor / Associate Prof. 3 (100%)

Readers' Discipline: Computer Science 3 (100%)

Article Metrics: News Mentions: 1
