Exploring the Practicality of Differentially Private Federated Learning: A Local Iteration Tuning Approach

Publisher:
Institute of Electrical and Electronics Engineers (IEEE)
Publication Type:
Journal Article
Citation:
IEEE Transactions on Dependable and Secure Computing, 2023, vol. PP, no. 99, pp. 1-15
Issue Date:
2023-01-01
Although Federated Learning (FL) prevents the exposure of original data samples when collaboratively training machine learning models among decentralized clients, it has been revealed that vanilla FL is still susceptible to adversarial attacks if model parameters are leaked to malicious attackers. To enhance the protection level of FL, Differentially Private Federated Learning (DPFL) has been proposed in recent years. DPFL injects zero-mean noise, randomly generated by differentially private (DP) mechanisms, into local model parameters before they are disclosed. Nevertheless, DP noise can significantly deteriorate model utility, jeopardizing the practicality of DPFL. In this paper, we are among the first to explore how to improve the model utility of DPFL by tuning the number of local iterations (LIs) on DPFL clients. Our work shows that such a local iteration tuning approach can effectively mitigate the adverse influence of DP noise on the final model utility. Formally, we derive the sensitivity (a measure of the maximum change of the output given two adjacent inputs) of the Laplace mechanism with respect to the number of LIs conducted on DPFL clients, as well as the aggregated variance of Laplace noise at the server side. We further conduct a convergence rate analysis to quantify the influence of the Laplace noise on the final model accuracy and determine how to optimally set the number of LIs. Finally, to verify our theoretical findings, we perform extensive experiments using three real-world datasets, namely, Lending Club, MNIST and Fashion-MNIST. The results not only corroborate our analysis, but also demonstrate that our approach significantly improves the practicality of DPFL.
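To illustrate the mechanism the abstract describes, the following is a minimal sketch (not the paper's exact formulation) of one DPFL client round with the Laplace mechanism in Python. The function names, the clipping rule, and the sensitivity bound used here are illustrative assumptions; the key idea it conveys is the one stated above: the sensitivity of the disclosed local update, and hence the Laplace noise scale, depends on the number of local iterations.

```python
import numpy as np

def squared_loss_grad(params, X, y):
    """Gradient of mean squared error for a toy linear model."""
    return 2.0 * X.T @ (X @ params - y) / len(y)

def dpfl_local_update(params, X, y, num_local_iters, lr=0.1,
                      clip_bound=1.0, epsilon=1.0, rng=None):
    """Hypothetical sketch of one DPFL client round (Laplace mechanism).

    Assumption: each clipped step changes the parameters by at most
    lr * clip_bound in L1 norm, so the released update has sensitivity
    num_local_iters * lr * clip_bound. This is an illustrative bound,
    not the one derived in the paper.
    """
    rng = rng or np.random.default_rng()
    params = params.copy()
    for _ in range(num_local_iters):
        grad = squared_loss_grad(params, X, y)
        # Clip each step in L1 norm to bound the total parameter change.
        grad = grad / max(1.0, np.linalg.norm(grad, 1) / clip_bound)
        params -= lr * grad

    # More local iterations -> larger sensitivity -> larger noise scale.
    sensitivity = num_local_iters * lr * clip_bound
    noise_scale = sensitivity / epsilon  # Laplace scale b = sensitivity / epsilon
    return params + rng.laplace(0.0, noise_scale, size=params.shape)

# Usage: the noisy update below is what a client would disclose to the server.
X = np.random.default_rng(0).normal(size=(64, 5))
y = X @ np.ones(5)
noisy_update = dpfl_local_update(np.zeros(5), X, y, num_local_iters=5)
```

Under these assumptions, increasing the number of local iterations speeds up local progress per round but inflates the injected noise, which is precisely the trade-off the paper's tuning approach optimizes.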