Indexed by:
Abstract:
Machine learning, particularly the neural network (NN), is extensively exploited in a wide range of applications. To reduce the computational burden on resource-constrained clients, large volumes of historical private data must be outsourced to a semi-trusted or malicious cloud for model training and evaluation. To achieve privacy preservation, most existing work relies either on public-key fully homomorphic encryption (FHE), which incurs considerable computational cost and ciphertext expansion, or on secure multiparty computation (SMC), which requires multiple rounds of interaction between the user and the cloud. To address these issues, in this article, a lightweight privacy-preserving model training and evaluation scheme LPTE for discretized NNs (DiNNs) is proposed. First, we put forward an efficient single-key fully homomorphic data encapsulation mechanism (SFH-DEM) that avoids public-key FHE. Based on SFH-DEM, we devise a series of atomic operations over the encrypted domain, including multivariate polynomial evaluation, nonlinear activation functions, gradient computation, and maximum operations, as building blocks. From these building blocks, we construct the full LPTE scheme for DiNNs, which can also be extended to convolutional NNs. Finally, we give formal security proofs for dataset privacy, model-training privacy, and model-evaluation privacy in the semi-honest setting, and evaluate LPTE on the real-world MNIST handwritten-digit dataset to demonstrate the high efficiency and accuracy of our proposed LPTE.
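For readers unfamiliar with the model class named in the abstract, below is a minimal plaintext sketch of the kind of discretized network (DiNN) that LPTE trains and evaluates. The layer sizes, the weight range {-1, 0, 1}, and the sign activation are illustrative assumptions, not the paper's exact parameters; in LPTE each dot product, activation, and the final argmax would instead be carried out over ciphertexts via the scheme's multivariate-polynomial, nonlinear-activation, and maximum building blocks.

```python
import numpy as np

rng = np.random.default_rng(42)

def sign(x):
    # Sign activation: maps pre-activations to {-1, +1}.
    # Discretized activations like this keep values small, which is
    # what makes per-layer homomorphic evaluation tractable.
    return np.where(x >= 0, 1, -1)

# Hypothetical dimensions for an MNIST-style task: 784 inputs,
# one hidden layer of 30 neurons, 10 output classes.
W1 = rng.integers(-1, 2, size=(30, 784))  # discretized weights in {-1, 0, 1}
W2 = rng.integers(-1, 2, size=(10, 30))

def dinn_forward(x):
    """Plaintext DiNN inference: integer dot products + sign activations.
    In the encrypted setting, each dot product maps to homomorphic
    additions/multiplications and the final argmax to the scheme's
    maximum operation."""
    h = sign(W1 @ x)               # hidden layer: values in {-1, +1}
    scores = W2 @ h                # integer class scores
    return int(np.argmax(scores))  # predicted class index

# Toy usage: a binarized 28x28 "image" flattened to 784 entries in {-1, +1}.
x = sign(rng.standard_normal(784))
print("predicted class:", dinn_forward(x))
```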
Keyword:
Reprint's Address:
Email:
Version:
Source:
IEEE INTERNET OF THINGS JOURNAL
ISSN: 2327-4662
Year: 2020
Issue: 4
Volume: 7
Page: 2663-2678
Impact Factor: 9.471 (JCR@2020), 8.200 (JCR@2023)
ESI Discipline: COMPUTER SCIENCE
ESI HC Threshold: 149
JCR Journal Grade: 1
CAS Journal Grade: 1
Cited Count:
WoS CC Cited Count: 14
SCOPUS Cited Count: 15
ESI Highly Cited Papers on the List: 0
WanFang Cited Count:
Chinese Cited Count:
30 Days PV: 1
Affiliated Colleges: