WeightRelay: Efficient Heterogeneous Federated Learning on Time Series

Publisher:
Springer Nature
Publication Type:
Conference Proceeding
Citation:
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2024, 14471 LNAI, pp. 129-140
Issue Date:
2024-01-01
Filename:
W.pdf
Description:
Published version
Size:
1.52 MB
Format:
Adobe PDF
Federated learning for heterogeneous devices aims to obtain models with various structural configurations so that each device can be fitted according to its hardware constraints and external environment. Existing solutions train these heterogeneous models simultaneously, which incurs extra cost (e.g., computation, communication, or data) to transfer knowledge between models. In this paper, we propose a method, named weight relay (WeightRelay), that obtains heterogeneous models without any extra training cost. Specifically, we find that, compared with classic random weight initialization, initializing a large neural network with the weights of a well-trained small network reduces the number of training epochs while maintaining similar performance. We can therefore order the models from smallest to largest and train them one by one, initializing each model (except the first) with the trained weights of the previous model to reduce training cost. In experiments, we evaluate weight relay on 128 time series datasets from multiple domains, and the results confirm the effectiveness of WeightRelay. Further theoretical analysis and code can be found at https://github.com/Wensi-Tang/DPSN/blob/master/AJCAI23_wensi_fedTSC.pdf.
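The core idea described above, initializing a larger network with a smaller trained one, can be illustrated with a minimal sketch. The function below is hypothetical (not from the paper's code): it copies a trained small weight matrix into the top-left corner of a larger, randomly initialized matrix, which is the simplest form of relaying weights between models of different widths.

```python
import random

def relay_init(small_weights, large_shape):
    """Initialize a large weight matrix for training by copying a trained
    small matrix into its top-left corner and randomly initializing the
    remaining entries (a simplified sketch of the weight-relay idea)."""
    rows, cols = large_shape
    # Classic random initialization for all entries first.
    large = [[random.gauss(0.0, 0.01) for _ in range(cols)]
             for _ in range(rows)]
    # Relay: overwrite the overlapping block with the trained weights.
    for i, row in enumerate(small_weights):
        for j, w in enumerate(row):
            large[i][j] = w
    return large

# "Trained" 2x2 weights from the smaller model.
small = [[1.0, 2.0],
         [3.0, 4.0]]
# Initialize a 3x4 weight matrix for the next-larger model.
large = relay_init(small, (3, 4))
```

In the actual method, the models are ordered by size and each one (after the first) starts training from the previous model's trained weights rather than from scratch.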