FedEco: Achieving Energy-Efficient Federated Learning by Hyperparameter Adaptive Tuning

Publisher:
Institute of Electrical and Electronics Engineers (IEEE)
Publication Type:
Journal Article
Citation:
IEEE Transactions on Cognitive Communications and Networking, 2024, vol. PP, no. 99, pp. 1-1
Issue Date:
2024-01-01
File: 1736486.pdf (Published version, Adobe PDF, 2.83 MB)
Abstract:
In Federated Learning (FL), each participating client must frequently compute local gradient updates and communicate with the central parameter server, which incurs high energy consumption on the clients. This energy consumption poses a significant challenge for battery-constrained clients (such as mobile devices, notebooks, and IoT sensors) in edge networks. To tackle this challenge, we propose FedEco, a method that minimizes the energy consumption of all participating clients in FL. FedEco has the following features: i) it minimizes energy consumption through adaptive hyperparameter tuning, i.e., by periodically allocating several kinds of optimized hyperparameters to the FL clients; ii) the tuned hyperparameters include computing hyperparameters (i.e., training speed and training volume) and communication hyperparameters (i.e., transmission speed and transmission volume); iii) it theoretically guarantees FL convergence while adapting to heterogeneous and dynamic edge environments; iv) it finds the optimal computing and communication hyperparameters by decomposing the studied problem into several subproblems and solving them iteratively. FedEco therefore achieves a good tradeoff between energy efficiency and model-training convergence. Extensive simulations show that FedEco reduces energy consumption by up to 87.04% on average compared with the benchmark algorithms while guaranteeing the convergence of model training.
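This record does not reproduce the paper's optimization problem, but a common way to model the per-client, per-round energy that these four hyperparameters control is sketched below. Every symbol here (kappa, c_k, f_k, D_k, p_k, s_k, r_k, B, h_k, N_0) is an illustrative assumption consistent with standard FL energy models, not taken from the paper itself.

```latex
% Illustrative per-client energy model (assumed, not from the paper).
% Computing hyperparameters: CPU frequency f_k (training speed) and
% local sample count D_k (training volume). Communication hyperparameters:
% transmit power p_k, which drives the rate r_k (transmission speed),
% and update size s_k (transmission volume).
\[
E_k^{\mathrm{cmp}} = \kappa\, c_k D_k f_k^{2}, \qquad
E_k^{\mathrm{com}} = p_k \cdot \frac{s_k}{r_k}, \qquad
r_k = B \log_2\!\Bigl(1 + \frac{p_k h_k}{N_0}\Bigr),
\]
\[
\min_{\{f_k,\, D_k,\, p_k,\, s_k\}} \; \sum_{k=1}^{K}
\bigl(E_k^{\mathrm{cmp}} + E_k^{\mathrm{com}}\bigr)
\quad \text{s.t. a convergence (accuracy or per-round deadline) constraint.}
\]
```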
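Feature iv) describes a decompose-and-iterate strategy. A minimal sketch of that pattern, using block-coordinate search over the energy model above, follows; FedEco's actual subproblem solvers are not described in this record, and every constant, name, and grid value below is hypothetical.

```python
import math
from dataclasses import dataclass, replace

@dataclass
class Hyper:
    cpu_freq: float   # computing: training speed, GHz (illustrative)
    samples: int      # computing: training volume per round (illustrative)
    tx_power: float   # communication: power controlling transmission speed, W
    payload: float    # communication: transmission volume, Mbit

# All constants below are assumed values for illustration only.
KAPPA = 1e-27        # effective switched capacitance
CYCLES = 2e4         # CPU cycles needed per training sample
BW, GAIN, NOISE = 1.0, 1.0, 1e-3   # channel bandwidth (MHz), gain, noise
DEADLINE = 5.0       # per-round wall-clock budget, seconds

def rate(h: Hyper) -> float:
    # Shannon-style achievable rate in Mbit/s (assumed channel model).
    return BW * math.log2(1 + h.tx_power * GAIN / NOISE)

def round_time(h: Hyper) -> float:
    # Local training time plus upload time for one FL round.
    return CYCLES * h.samples / (h.cpu_freq * 1e9) + h.payload / rate(h)

def round_energy(h: Hyper) -> float:
    # Dynamic CPU energy (kappa * cycles * f^2) plus transmit energy (p * t_tx).
    e_cmp = KAPPA * CYCLES * h.samples * (h.cpu_freq * 1e9) ** 2
    return e_cmp + h.tx_power * h.payload / rate(h)

def tune(h: Hyper, grids: dict, sweeps: int = 5) -> Hyper:
    """Block-coordinate search: optimize one hyperparameter at a time while
    the others stay fixed, keeping only deadline-feasible candidates, and
    sweep over the blocks a few times until the choice stabilizes."""
    for _ in range(sweeps):
        for field, candidates in grids.items():
            h = min(
                (replace(h, **{field: c}) for c in candidates),
                key=lambda x: round_energy(x)
                if round_time(x) <= DEADLINE else float("inf"),
            )
    return h

if __name__ == "__main__":
    h0 = Hyper(cpu_freq=2.0, samples=512, tx_power=0.5, payload=8.0)
    grids = {
        "cpu_freq": [0.5, 1.0, 1.5, 2.0],
        "samples": [128, 256, 512],
        "tx_power": [0.1, 0.3, 0.5, 1.0],
        "payload": [2.0, 4.0, 8.0],
    }
    h_star = tune(h0, grids)
    print(h_star, round_energy(h_star), round_time(h_star))
```

The design choice mirrored here is the one the abstract names: each block (one hyperparameter, with the rest held fixed) is a far simpler subproblem than the joint one, and iterating over the blocks trades a little optimality for tractability under heterogeneous per-client constants.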