Indexed by:
Abstract:
Joining multi-source data for model training can improve the accuracy of neural networks. To address the privacy concerns raised by data sharing, data are generally encrypted and outsourced to a group of cloud servers for computing and processing. In this client-cloud architecture, we propose FPPNet, a fast and privacy-preserving neural network for secure inference on sensitive data. FPPNet is deployed on three cloud servers, which collaboratively perform privacy-preserving computation via three-party arithmetic secret sharing. We develop a secure conversion method between additive shares and multiplicative shares, and propose three secure protocols for computing non-linear functions, namely comparison, exponentiation and division, that outperform prior three-party works. Secure modules for running convolutional, ReLU, max-pooling and Sigmoid layers are designed to implement FPPNet. We theoretically analyze the security and complexity of the proposed protocols. On the MNIST dataset with two types of neural networks, experimental results validate that FPPNet is faster than related works and achieves the same accuracy as the plaintext neural network. © 2022, ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering.
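To illustrate the three-party arithmetic secret sharing underlying FPPNet, the sketch below shows additive sharing over a ring: a secret is split into three random shares held by different servers, reconstruction sums the shares, and addition of shared values is purely local. This is a minimal, generic illustration, not the paper's actual protocol; the ring size, share-conversion method, and non-linear protocols (comparison, exponentiation, division) in FPPNet are not reproduced here.

```python
import secrets

RING = 2**32  # illustrative ring modulus; the paper's choice is not specified here


def share(x: int, n: int = 3) -> list[int]:
    """Split x into n additive shares that sum to x modulo RING."""
    shares = [secrets.randbelow(RING) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % RING)
    return shares


def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares modulo RING."""
    return sum(shares) % RING


def add_shared(a: list[int], b: list[int]) -> list[int]:
    """Each server adds its own shares locally; no communication is needed."""
    return [(ai + bi) % RING for ai, bi in zip(a, b)]


# Example: three servers jointly hold x and y without any server seeing either value.
x_shares, y_shares = share(7), share(35)
assert reconstruct(add_shared(x_shares, y_shares)) == 42
```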
Keyword:
Reprint Author's Address:
Email:
Source:
ISSN: 1867-8211
Year: 2022
Volume: 451 LNICST
Page: 165-178
Language: English
Cited Count:
WoS CC Cited Count: 0
SCOPUS Cited Count: 2
ESI Highly Cited Papers on the List: 0
WanFang Cited Count:
Chinese Cited Count:
30 Days PV: 2
Affiliated Colleges: