Privacy, accuracy, and model fairness trade-offs in federated learning

Publisher:
Elsevier
Publication Type:
Journal Article
Citation:
Computers & Security, vol. 122, 2022, art. no. 102907
Issue Date:
2022-11-01
File: 1-s2.0-S0167404822003005-main.pdf (Published version, Adobe PDF, 1.86 MB)
As applications of machine learning become increasingly widespread, the need to ensure model accuracy and fairness while protecting the privacy of user data becomes more pronounced. To this end, this paper introduces a federated learning training model that allows clients to learn their models locally while updating the associated parameters on a centralized server. Our approach seeks an acceptable trade-off between privacy, accuracy, and model fairness by using differential privacy (DP), which minimizes privacy risks by hiding whether any specific sample is present in the training data. Machine learning models can, however, exhibit unintended behaviors such as unfairness, in which groups with certain sensitive characteristics (e.g., gender) receive systematically different outcomes. Hence, we analyze the fairness and privacy effects of local DP and global DP when applied to federated learning by designing a mechanism that quantifies both fairness and privacy. We measure the level of fairness against the constraints of three fairness definitions: demographic parity, equalized odds, and equality of opportunity. Finally, findings from our extensive experiments on three real-world datasets with class imbalance demonstrate the positive effect of local and global DP on fairness. Our study also shows that privacy can come at the cost of fairness: stricter privacy can intensify discrimination. Hence, we posit that careful parameter selection can help achieve a more effective trade-off between utility, bias, and privacy.
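The abstract contrasts local DP (each client perturbs its update before it leaves the device) with global DP (the server perturbs the aggregate). The sketch below is only an illustration of that distinction using the Gaussian mechanism with norm clipping; the function names, the clipping bound `C`, and the noise multiplier `sigma` are illustrative assumptions, not the paper's actual mechanism or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def clip(update, C):
    # Clip the update to L2 norm at most C so the Gaussian mechanism's
    # sensitivity is bounded (DP-SGD-style clipping). Illustrative only.
    update = np.asarray(update, dtype=float)
    norm = np.linalg.norm(update)
    return update * min(1.0, C / max(norm, 1e-12))

def local_dp_round(client_updates, C=1.0, sigma=0.8):
    # Local DP: every client adds its own noise draw before sending,
    # so the server never sees a clean individual update.
    noisy = [clip(u, C) + rng.normal(0.0, sigma * C, size=np.shape(u))
             for u in client_updates]
    return np.mean(noisy, axis=0)

def global_dp_round(client_updates, C=1.0, sigma=0.8):
    # Global (central) DP: the trusted server clips and sums first,
    # then adds a single noise draw to the aggregate.
    total = np.sum([clip(u, C) for u in client_updates], axis=0)
    return (total + rng.normal(0.0, sigma * C, size=total.shape)) / len(client_updates)
```

Under this sketch, local DP injects one noise draw per client while global DP injects one per round, which is why local DP typically costs more accuracy at the same privacy level.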
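The three fairness definitions named above have standard empirical forms: demographic parity compares positive-prediction rates across groups, equality of opportunity compares true-positive rates, and equalized odds additionally compares false-positive rates. A minimal sketch of these group-gap computations follows; the function name and the binary sensitive attribute encoding (0/1) are assumptions for illustration, not the paper's quantification mechanism.

```python
import numpy as np

def fairness_gaps(y_true, y_pred, sensitive):
    """Absolute gaps between two groups (sensitive attribute A in {0, 1})
    for the three fairness definitions discussed in the abstract."""
    y_true, y_pred, sensitive = map(np.asarray, (y_true, y_pred, sensitive))
    g0, g1 = sensitive == 0, sensitive == 1

    # Demographic parity: |P(yhat=1 | A=0) - P(yhat=1 | A=1)|
    dp = abs(y_pred[g0].mean() - y_pred[g1].mean())

    # Equality of opportunity: TPR gap, P(yhat=1 | y=1, A=a)
    tpr = lambda g: y_pred[g & (y_true == 1)].mean()
    eop = abs(tpr(g0) - tpr(g1))

    # Equalized odds: worst of the TPR gap and the FPR gap
    fpr = lambda g: y_pred[g & (y_true == 0)].mean()
    eod = max(eop, abs(fpr(g0) - fpr(g1)))

    return {"demographic_parity": dp,
            "equal_opportunity": eop,
            "equalized_odds": eod}
```

A gap of 0 means the constraint is satisfied exactly; the paper's experiments track how these gaps move as the DP noise level tightens.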