Abstract:
Federated learning is a machine learning paradigm that enables collaborative learning among clients while preserving the privacy of clients' data. Federated multitask learning (FMTL) addresses the statistical challenge of non-independent and identically distributed (non-IID) data by training a personalized model for each client, yet it requires all clients to be online in every training round. To eliminate the limitation of full participation, we explore multitask learning combined with model clustering and first propose a clustered FMTL scheme that achieves multitask learning on non-IID data while simultaneously improving communication efficiency and model accuracy. To enhance privacy, we adopt a general dual-server architecture and further propose a secure clustered FMTL scheme by designing a series of secure two-party computation protocols. Convergence analysis and security analysis are conducted to prove the correctness and security of our methods. Numerical evaluation on public datasets validates that our methods are superior to state-of-the-art methods in handling non-IID data while protecting privacy.
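To illustrate the clustered-FMTL idea summarized in the abstract, the sketch below clusters simulated client model updates by cosine similarity and aggregates them per cluster, so that clients with similar non-IID data share a personalized cluster model. This is a minimal illustration only, not the paper's actual algorithm and not its secure two-party computation protocols; all names and parameters (cosine_kmeans, num_clients, model_dim, n_clusters, the simulated data) are assumptions made for demonstration.

# Minimal sketch (not the paper's protocol): cluster client model updates by
# cosine similarity with plain k-means, then average within each cluster so
# clients with similar (non-IID) data share a personalized cluster model.
import numpy as np

rng = np.random.default_rng(0)

def cosine_kmeans(updates, n_clusters, n_iters=20):
    """Cluster L2-normalized client updates with k-means in cosine geometry."""
    X = updates / np.linalg.norm(updates, axis=1, keepdims=True)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iters):
        sims = X @ centers.T                 # cosine similarity to each center
        labels = sims.argmax(axis=1)         # assign each client to the most similar cluster
        for k in range(n_clusters):
            members = X[labels == k]
            if len(members):
                c = members.mean(axis=0)
                centers[k] = c / np.linalg.norm(c)
    return labels

# Simulated local updates from 12 clients drawn from 3 latent task groups (assumed values).
num_clients, model_dim, n_clusters = 12, 16, 3
group_means = rng.normal(size=(n_clusters, model_dim))
updates = np.vstack([group_means[i % n_clusters] + 0.1 * rng.normal(size=model_dim)
                     for i in range(num_clients)])

labels = cosine_kmeans(updates, n_clusters)
# Per-cluster aggregation: each cluster keeps its own personalized model.
cluster_models = {k: updates[labels == k].mean(axis=0) for k in np.unique(labels)}
print({k: v[:3].round(2) for k, v in cluster_models.items()})

In the paper's setting, the clustering and aggregation would run on the dual-server side over protected client updates rather than on plaintext vectors as shown here.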
Source: IEEE INTERNET OF THINGS JOURNAL
ISSN: 2327-4662
Year: 2023
Issue: 4
Volume: 10
Page: 3453-3467
Impact Factor: 8.2 (JCR@2023)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold: 32
JCR Journal Grade:1
CAS Journal Grade:1
Cited Count:
WoS CC Cited Count: 11
SCOPUS Cited Count: 16
ESI Highly Cited Papers on the List: 0