
Author:

Lyu, Lingjuan [1] | Chen, Chi-Hua [2]

Indexed by:

EI; Scopus

Abstract:

The increasing demand for on-device deep learning necessitates the deployment of deep models on mobile devices. However, directly deploying deep models on mobile devices presents both a capacity bottleneck and a prohibitive privacy risk. To address these problems, we develop a Differentially Private Knowledge Distillation (DPKD) framework that enables on-device deep learning while preserving training-data privacy. We modify the conventional Private Aggregation of Teacher Ensembles (PATE) paradigm by compressing the knowledge acquired by the ensemble of teachers into a student model in a differentially private manner. The student model is then trained on both the labeled public data and the distilled knowledge using a mixed training algorithm. Extensive experiments on popular image datasets, as well as a real implementation on a mobile device, show that DPKD not only benefits from the distilled knowledge but also provides a strong differential privacy guarantee (ε = 2) with only a marginal decrease in accuracy. © 2020 ACM.
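
A minimal sketch of the two ingredients the abstract names, assuming a PATE-style pipeline: teachers trained on disjoint data partitions label each public example by a noisy plurality vote (Laplace mechanism), and the student minimizes a mixed objective over the labeled public data and the distilled labels. The function names, the weighting `alpha`, and the use of epsilon = 2 as a per-query budget are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import torch.nn.functional as F

def aggregate_teacher_votes(votes, num_classes, epsilon=2.0, rng=None):
    """Differentially private label for one public example.

    votes: integer array of shape (num_teachers,), each entry a class index
    predicted by one teacher trained on a disjoint data partition.
    """
    if rng is None:
        rng = np.random.default_rng()
    counts = np.bincount(votes, minlength=num_classes).astype(float)
    # Changing one teacher's training data can move its vote between two
    # bins, so perturbing each count with Laplace noise of scale 2/epsilon
    # makes the argmax epsilon-DP per query (PATE's own accounting is tighter).
    counts += rng.laplace(loc=0.0, scale=2.0 / epsilon, size=num_classes)
    return int(np.argmax(counts))

def mixed_loss(logits_public, labels_public, logits_distilled,
               labels_distilled, alpha=0.5):
    """Mixed training objective: cross-entropy on the labeled public data
    combined with cross-entropy on the privately distilled labels."""
    ce_public = F.cross_entropy(logits_public, labels_public)
    ce_distilled = F.cross_entropy(logits_distilled, labels_distilled)
    return alpha * ce_public + (1.0 - alpha) * ce_distilled

# Example: 5 teachers vote on one public example with 10 classes.
votes = np.array([3, 3, 1, 3, 0])
print(aggregate_teacher_votes(votes, num_classes=10))
```

Each noisy vote spends privacy budget; PATE composes these per-query costs (for example with the moments accountant) to arrive at an overall guarantee such as the ε = 2 the abstract reports.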

Keyword:

Data privacy; Deep learning; Distillation; Distilleries; Information retrieval

Affiliations:

  • [ 1 ] [Lyu, Lingjuan]National University of Singapore, Singapore, Singapore
  • [ 2 ] [Chen, Chi-Hua]Fuzhou University, Fuzhou, China

Year: 2020

Page: 1809-1812

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 12

ESI Highly Cited Papers on the List: 0
