D2P-Fed: Differentially Private Federated Learning With Efficient Communication
Ruoxi Jia
Abstract
In this paper, we propose discrete Gaussian based differentially private federated learning (D2P-Fed), a unified scheme that achieves both differential privacy (DP) and communication efficiency in federated learning (FL). In particular, compared with the only prior work that addresses both aspects, D2P-Fed provides a stronger privacy guarantee, better composability, and a smaller communication cost. The key idea is to apply discrete Gaussian noise to the private data transmission. We provide a complete analysis of the privacy guarantee, communication cost, and convergence rate of D2P-Fed. We evaluate D2P-Fed on INFIMNIST and CIFAR-10. The results show that D2P-Fed outperforms the state of the art by 4.7% to 13.0% in model accuracy while saving one third of the communication cost.
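The abstract's key idea, adding discrete Gaussian noise to the transmitted (quantized) client updates, can be illustrated with a minimal sketch. The function names, the truncated sampler, and the hyperparameters (`scale`, `sigma`, `modulus`) below are illustrative assumptions, not the paper's actual implementation or parameter choices.

```python
import numpy as np

def sample_discrete_gaussian(sigma, size, tail=10.0):
    """Sample from a (truncated) discrete Gaussian over the integers.

    Probabilities are proportional to exp(-k^2 / (2 * sigma^2)) for integer k.
    Truncating the support at +/- tail*sigma is a simplification for
    illustration; the paper's analysis uses the untruncated distribution.
    """
    bound = int(np.ceil(tail * sigma))
    support = np.arange(-bound, bound + 1)
    probs = np.exp(-support.astype(float) ** 2 / (2.0 * sigma ** 2))
    probs /= probs.sum()
    return np.random.choice(support, size=size, p=probs)

def privatize_update(update, scale, sigma, modulus):
    """Quantize a client's model update to an integer grid, add discrete
    Gaussian noise, and wrap into a modular range before transmission.

    All three parameters are hypothetical illustrations of the kind of
    quantization/noise/modulus configuration such a scheme requires.
    """
    quantized = np.round(update * scale).astype(np.int64)   # real -> integer grid
    noisy = quantized + sample_discrete_gaussian(sigma, quantized.shape)
    return np.mod(noisy, modulus)                            # modular wrap for transmission

# Example: privatize a toy 5-dimensional update.
toy_update = np.array([0.12, -0.03, 0.45, 0.00, -0.27])
print(privatize_update(toy_update, scale=2**10, sigma=32.0, modulus=2**16))
```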
Publication Details
- Date of publication:
- January 2, 2021
- Venue:
- arXiv / CoRR (Cornell University)
- Publication note:
Lun Wang, Ruoxi Jia: Private Distributed Mean Estimation. CoRR abs/2006.13039 (2020)