
[Conference Paper]

Process Knowledge Distillation for Multi-Person Pose Estimation


Author:

Zhao, Zhifeng [1] | Yan, Zheng [2] | She, MingLei [3] | Chen, Guodong [4]

Indexed by:

EI; Scopus

Abstract:

Existing multi-person pose estimation methods tend to grow ever larger in parameters to improve generalisation performance, which demands huge computational resources. We therefore propose a lightweight multi-person pose estimation method called PKDHP (Process Knowledge Distillation for Human Pose estimation), which applies knowledge distillation to pose estimation. PKDHP treats the training of the student model in knowledge distillation as a human learning process. Specifically, PKDHP proposes the CABF module, which replaces the ABF module in ReviewKD and uses the model's higher-level features to guide its lower-level features, and proposes the Transfer knowledge distillation method, which forces the student model to imitate the teacher model's process of feature transfer, further simulating the human learning style. Compared with other knowledge distillation methods, PKDHP achieves higher performance with the same model parameters on two multi-person pose estimation datasets (COCO and MPII). © 2023 SPIE.
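The abstract describes feature-level knowledge distillation in which higher-level features guide lower-level ones across levels, in the spirit of ReviewKD-style review mechanisms. The following is a minimal, hypothetical sketch of such a multi-level feature distillation loss in pure Python; the actual CABF module, Transfer loss, and network architectures of PKDHP are not specified in this record, so the `fuse` weighting and per-level MSE here are illustrative assumptions only.

```python
def mse(a, b):
    """Mean squared error between two equal-length feature vectors."""
    assert len(a) == len(b)
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def fuse(high, low, alpha=0.5):
    """Toy stand-in for attention-based fusion: a higher-level feature
    guides a lower-level one via a fixed weighted combination."""
    return [alpha * h + (1 - alpha) * l for h, l in zip(high, low)]

def distillation_loss(student_feats, teacher_feats, weight=1.0):
    """Sum of per-level losses over feature pyramids (shallow -> deep).

    The deepest student level is compared to the teacher directly; each
    shallower level is first fused with the running higher-level feature,
    so higher-level knowledge flows downward before comparison.
    """
    fused = student_feats[-1]                      # deepest level
    loss = mse(fused, teacher_feats[-1])
    for s, t in zip(reversed(student_feats[:-1]),
                    reversed(teacher_feats[:-1])):
        fused = fuse(fused, s)                     # higher guides lower
        loss += mse(fused, t)
    return weight * loss
```

In a real training loop this loss would be added to the ordinary pose-estimation (heatmap) loss, so the student learns both the task and the teacher's intermediate representations.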

Keyword:

Computer vision; Distillation; Learning systems; Parameter estimation

Affiliations:

  • [ 1 ] [Zhao, Zhifeng]College of Physics and Information Engineering, Fuzhou University, Fuzhou; 350116, China
  • [ 2 ] [Yan, Zheng]College of Physics and Information Engineering, Fuzhou University, Fuzhou; 350116, China
  • [ 3 ] [She, MingLei]College of Physics and Information Engineering, Fuzhou University, Fuzhou; 350116, China
  • [ 4 ] [Chen, Guodong]College of Physics and Information Engineering, Fuzhou University, Fuzhou; 350116, China

Reprint Author's Address:


Source:

ISSN: 0277-786X

Year: 2023

Volume: 12707

Language: English

