Abstract:
Existing multi-person pose estimation methods tend to enlarge model parameters to improve generalisation performance, which demands substantial computational resources. We therefore propose PKDHP (Process Knowledge Distillation for Human Pose estimation), a lightweight multi-person pose estimation method that applies knowledge distillation to pose estimation. PKDHP treats the training of the student model in knowledge distillation as a human learning process: it introduces the CABF module, which replaces the ABF module in ReviewKD and guides lower-level features with the model's higher-level features, and it proposes a Transfer knowledge distillation method that forces the student model to imitate the teacher model's feature-transfer process, further simulating how humans learn. With the same model parameters, PKDHP achieves higher performance than other knowledge distillation methods on two multi-person pose estimation datasets (COCO and MPII). © 2023 SPIE.
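The abstract describes feature-based knowledge distillation, where a student network is trained to match a teacher's intermediate features. Below is a minimal, generic sketch of such a feature-matching loss in NumPy; the function name `feature_distill_loss` and the toy data are illustrative assumptions and do not reproduce PKDHP's actual CABF or Transfer formulations.

```python
import numpy as np

def feature_distill_loss(student_feats, teacher_feats):
    # Generic feature-distillation objective: mean-squared error between
    # matched student/teacher feature maps, summed over network stages.
    # NOTE: a hypothetical sketch, not PKDHP's CABF/Transfer method.
    return sum(float(np.mean((s - t) ** 2))
               for s, t in zip(student_feats, teacher_feats))

# Toy example: two stages of 4x4 feature maps.
rng = np.random.default_rng(0)
teacher = [rng.standard_normal((4, 4)) for _ in range(2)]
student = [f + 0.1 for f in teacher]  # student offset from teacher by 0.1
loss = feature_distill_loss(student, teacher)
```

In practice this loss would be added to the task loss (here, heatmap regression for pose estimation) and minimised with respect to the student's parameters only.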
ISSN: 0277-786X
Year: 2023
Volume: 12707
Language: English