FedSel: Federated SGD Under Local Differential Privacy with Top-k Dimension Selection

89 Citations
70 Readers (Mendeley users who have this article in their library)

Abstract

As massive amounts of data are produced by small gadgets, federated learning on mobile devices has become an emerging trend. In the federated setting, Stochastic Gradient Descent (SGD) is widely used to train various machine learning models. To prevent privacy leakage from gradients computed on users' sensitive data, local differential privacy (LDP) has recently been adopted as a privacy guarantee for federated SGD. However, existing solutions suffer from a dimension-dependency problem: the injected noise is substantially proportional to the dimension d. In this work, we propose FedSel, a two-stage framework for federated SGD under LDP that relieves this problem. Our key idea is that not all dimensions are equally important, so we privately select the Top-k dimensions according to their contributions in each iteration of federated SGD. Specifically, we propose three private dimension-selection mechanisms and adapt the gradient-accumulation technique to stabilize the learning process with noisy updates. We also theoretically analyze the privacy, accuracy, and time complexity of FedSel, which outperforms state-of-the-art solutions. Experiments on real-world and synthetic datasets verify the effectiveness and efficiency of our framework.
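The abstract outlines a two-stage client-side idea: privately select one of the Top-k dimensions by contribution, then report only a perturbed value for that dimension, so the injected noise no longer scales with the full dimension d. Below is a minimal Python sketch of that idea, assuming an exponential-mechanism-style selection over gradient magnitudes, Laplace perturbation of the clipped selected entry, and an even split of the privacy budget; the function names, the budget split, and both mechanisms are illustrative assumptions, not the paper's three proposed constructions.

```python
import numpy as np

def private_topk_select(grad, k, eps, rng):
    """Privately pick one coordinate, favoring the top-k of |grad|.

    Exponential-mechanism-style sketch: coordinates in the (non-private)
    top-k set receive utility 1, all others utility 0. Illustrative
    stand-in for FedSel's selection stage, not the paper's exact design.
    """
    d = grad.size
    topk = np.argsort(np.abs(grad))[-k:]      # indices of the k largest magnitudes
    scores = np.zeros(d)
    scores[topk] = 1.0                        # utility 1 for top-k, 0 otherwise
    weights = np.exp(eps * scores / 2.0)      # exponential mechanism weights
    return rng.choice(d, p=weights / weights.sum())

def perturb_value(value, eps, clip=1.0, rng=None):
    """Clip the selected gradient entry and add Laplace noise (eps-LDP sketch)."""
    v = np.clip(value, -clip, clip)
    scale = 2.0 * clip / eps                  # sensitivity of a clipped scalar is 2*clip
    return v + rng.laplace(0.0, scale)

def fedsel_client_update(grad, k, eps, rng):
    """Two-stage client report: split the budget between selection and value."""
    eps_sel, eps_val = eps / 2.0, eps / 2.0   # assumed even budget split
    j = private_topk_select(grad, k, eps_sel, rng)
    sparse = np.zeros_like(grad)
    sparse[j] = perturb_value(grad[j], eps_val, rng=rng)
    return sparse                             # only one noisy dimension leaves the device

rng = np.random.default_rng(0)
grad = rng.normal(size=100)
print(fedsel_client_update(grad, k=10, eps=4.0, rng=rng))
```

Because each client reports a single noisy coordinate, the server-side aggregate stays sparse; the paper's gradient-accumulation technique (not shown here) would carry the unselected residual gradient forward to stabilize training under these noisy updates.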




Citation (APA)

Liu, R., Cao, Y., Yoshikawa, M., & Chen, H. (2020). FedSel: Federated SGD under local differential privacy with Top-k dimension selection. In Lecture Notes in Computer Science (Vol. 12112 LNCS, pp. 485–501). Springer. https://doi.org/10.1007/978-3-030-59410-7_33

Readers' Seniority

PhD / Post grad / Masters / Doc: 30 (83%)
Researcher: 5 (14%)
Lecturer / Post doc: 1 (3%)

Readers' Discipline

Computer Science: 32 (82%)
Engineering: 6 (15%)
Mathematics: 1 (3%)
