
Author:

Huang, Yisong [1] | Li, Jin [2] | Chen, Xinlong [3] | Fu, Yang-Geng [4] (Scholars: 傅仰耿)

Indexed by:

EI

Abstract:

Recent studies have shown that Graph Transformers (GTs) can be effective for specific graph-level tasks. However, when it comes to node classification, training GTs remains challenging, especially in semi-supervised settings with a severe scarcity of labeled data. Our paper addresses this research gap by focusing on semi-supervised node classification. To this end, we develop a curriculum-enhanced attention distillation method that employs a Local GT teacher and a Global GT student. Additionally, we introduce the concepts of in-class and out-of-class, and propose two improvements, out-of-class entropy and top-k pruning, to facilitate the student's out-of-class exploration under the teacher's in-class guidance. Taking inspiration from human learning, our distillation method uses a curriculum mechanism that initially provides strict guidance to the student and gradually allows more out-of-class exploration through a dynamic balance. Extensive experiments show that our method outperforms many state-of-the-art methods on seven public graph benchmarks, demonstrating its effectiveness. © 2024 12th International Conference on Learning Representations, ICLR 2024. All rights reserved.
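
Read as an algorithm, the abstract suggests a training objective with three ingredients: a matching term that pulls the Global GT student's attention toward the Local GT teacher's top-k-pruned attention (in-class guidance), an entropy bonus over the pruned-away entries (out-of-class exploration), and a curriculum weight that gradually shifts from the former to the latter. The sketch below is a reconstruction from the abstract alone, not the authors' released code: the dense N×N attention matrices, the KL-divergence form of the matching term, the exact form of the out-of-class entropy, and the linear schedule `curriculum_weight` are all assumptions.

```python
import torch
import torch.nn.functional as F

def attention_distillation_loss(teacher_attn, student_attn, k, lam):
    """Hypothetical sketch of curriculum-enhanced attention distillation.

    teacher_attn: (N, N) raw attention scores from the Local GT teacher
    student_attn: (N, N) raw attention scores from the Global GT student
    k:            number of teacher entries kept per node (top-k pruning)
    lam:          curriculum weight in [0, 1]; near 1 = strict in-class
                  guidance, near 0 = free out-of-class exploration
    """
    # Top-k pruning: keep only the teacher's k strongest entries per node,
    # defining the "in-class" region the student is matched against.
    _, topk_idx = teacher_attn.topk(k, dim=-1)
    in_mask = torch.zeros_like(teacher_attn).scatter_(-1, topk_idx, 1.0)

    # In-class guidance: KL divergence from the student's attention to the
    # teacher's attention renormalized over the kept entries.
    t_probs = F.softmax(teacher_attn.masked_fill(in_mask == 0, float("-inf")), dim=-1)
    s_log_probs = F.log_softmax(student_attn, dim=-1)
    in_class_kl = F.kl_div(s_log_probs, t_probs, reduction="batchmean")

    # Out-of-class entropy: entropy of the student's attention restricted to
    # the pruned ("out-of-class") entries; maximizing it spreads exploration.
    s_probs = s_log_probs.exp()
    out_probs = s_probs * (1.0 - in_mask)
    out_probs = out_probs / out_probs.sum(-1, keepdim=True).clamp_min(1e-12)
    out_entropy = -(out_probs * out_probs.clamp_min(1e-12).log()).sum(-1).mean()

    # Dynamic balance: strict guidance early, more exploration later.
    return lam * in_class_kl - (1.0 - lam) * out_entropy

def curriculum_weight(epoch, total_epochs):
    # Assumed linear curriculum; the paper's actual schedule may differ.
    return max(0.0, 1.0 - epoch / total_epochs)

# Toy usage on a random 8-node graph (single head, single layer).
N, k = 8, 3
teacher_attn = torch.randn(N, N)                      # frozen teacher scores
student_attn = torch.randn(N, N, requires_grad=True)  # student scores
lam = curriculum_weight(epoch=10, total_epochs=100)
loss = attention_distillation_loss(teacher_attn, student_attn, k, lam)
loss.backward()
```

In practice the attention would be per-head and defined over each node's neighborhood (for the local teacher) or the full node set (for the global student), and this distillation term would be added to the semi-supervised cross-entropy loss on the labeled nodes; the dense single-head version above only illustrates how the three ingredients interact.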

Keyword:

Curricula; Distillation; Students

Community:

  • [1] [Huang, Yisong] College of Computer and Data Science, Fuzhou University, Fuzhou, China
  • [2] [Li, Jin] College of Computer and Data Science, Fuzhou University, Fuzhou, China
  • [3] [Li, Jin] AI Thrust, Information Hub, HKUST (Guangzhou), Guangzhou, China
  • [4] [Chen, Xinlong] College of Computer and Data Science, Fuzhou University, Fuzhou, China
  • [5] [Fu, Yang-Geng] College of Computer and Data Science, Fuzhou University, Fuzhou, China

Source: 12th International Conference on Learning Representations, ICLR 2024

Year: 2024

Language: English

ESI Highly Cited Papers on the List: 0
