Abstract:
Federated learning is a popular framework designed to perform distributed machine learning while protecting client privacy. However, the heterogeneous data distributions found in real-world environments make it difficult for model training to converge. In this article, we propose federated gradient scheduling (FedGS), an improved historical-gradient sampling method for optimizers that rely on historical gradients in federated learning, which alleviates the instability of historical gradient information caused by non-IID data. FedGS improves federated learning performance in two main steps. First, clients are clustered by their label distributions, which relabels the clients and the gradients they submit. Second, the gradient clusters are sampled to generate an IID gradient set, which is fed to the optimizer to derive valid momentum information. In addition, we introduce differential privacy to work with FedGS and strengthen clients' privacy protection. Compared with previous non-IID federated learning solutions, our method is more resistant to temporal non-IID data. Moreover, experiments show that FedGS converges faster and achieves performance gains of up to 10% over existing state-of-the-art methods in some scenarios. FedGS can also be easily combined with existing methods to achieve better performance. We further verify that our method yields robust performance gains across different non-IID scenarios, demonstrating its adaptability to different settings.
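The following is a minimal sketch of the two FedGS steps summarized in the abstract: clustering clients by their label distributions and then sampling one gradient per cluster to assemble a roughly IID gradient set for a momentum-based server optimizer. All names, parameters, and the use of k-means are illustrative assumptions rather than the authors' reference implementation.

```python
# Hedged sketch of the abstract's two steps (not the paper's exact algorithm).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
NUM_CLIENTS, NUM_CLASSES, DIM, NUM_CLUSTERS = 20, 10, 5, 4

# Synthetic per-client label distributions (rows sum to 1) and local gradients.
label_dists = rng.dirichlet(alpha=np.ones(NUM_CLASSES) * 0.3, size=NUM_CLIENTS)
client_grads = rng.normal(size=(NUM_CLIENTS, DIM))

# Step 1: relabel clients by clustering their label distributions.
cluster_ids = KMeans(n_clusters=NUM_CLUSTERS, n_init=10,
                     random_state=0).fit_predict(label_dists)

# Step 2: sample one client gradient from each cluster so the sampled set
# covers all clusters, approximating an IID gradient set.
sampled = []
for c in range(NUM_CLUSTERS):
    members = np.flatnonzero(cluster_ids == c)
    sampled.append(client_grads[rng.choice(members)])
iid_grad = np.mean(sampled, axis=0)

# Feed the aggregated gradient to a momentum optimizer on the server side.
lr, beta = 0.1, 0.9
momentum = np.zeros(DIM)
global_model = np.zeros(DIM)
momentum = beta * momentum + iid_grad
global_model -= lr * momentum
print("cluster sizes:", np.bincount(cluster_ids, minlength=NUM_CLUSTERS))
print("updated model:", global_model)
```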
Source: IEEE INTERNET OF THINGS JOURNAL
ISSN: 2327-4662
Year: 2023
Issue: 1
Volume: 10
Page: 747-762
Impact Factor: 8.2 (JCR@2023)
ESI Discipline: COMPUTER SCIENCE
ESI HC Threshold: 32
JCR Journal Grade: 1
CAS Journal Grade: 1
Cited Count:
WoS CC Cited Count: 11
SCOPUS Cited Count: 11
ESI Highly Cited Papers on the List: 0