
Author:

Zhuang, Jianzhi [1] | Li, Jin [2] | Shi, Chenjunhao [3] | Lin, Xinyi [4] | Fu, Yang-Geng [5]

Indexed by:

EI

Abstract:

Graph representation learning is a crucial area in machine learning, with widespread applications in social networks, recommendation systems, and traffic flow prediction. Recently, Graph Transformers have emerged as powerful tools for this purpose, garnering significant attention. In this work, we observe a fundamental issue with previous Graph Transformers: they overlook the scale-related information gap and often apply an identical attention computation to node interactions at different scales, leading to suboptimal model performance. To address this, we propose a Multi-Scale Attention Graph Transformer (MSA-GT) that enables each node to conduct adaptive interactions conditioned on different scales, from both local and global perspectives. Specifically, MSA-GT guides several attention mechanisms to focus on individual scales and then combines their outputs via an attention-based fusion module, thereby obtaining semantically finer-grained node representations. Despite the potential of this design, we still observe some over-fitting, a typical challenge in training Graph Transformers. We therefore propose two additional technical components to curb over-fitting and further improve performance. First, we introduce a path-based pruning strategy that removes ineffective attention interactions, facilitating more accurate selection of relevant nodes. Second, we propose a Heterophilous Curriculum Augmentation (HCA) module, which gradually increases the training difficulty, forming a weak-to-strong regularization schedule that enhances the model's generalization ability step by step. Extensive experiments on eight public graph benchmarks show that our method outperforms many state-of-the-art methods, demonstrating its effectiveness. © 2024 Elsevier B.V.
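The abstract describes per-scale attention mechanisms whose outputs are combined by an attention-based fusion module. As a rough illustration only (this record contains no code, and the paper's actual implementation is not shown here), the following PyTorch sketch shows one plausible wiring of per-scale masked attention plus learned fusion; every class, parameter, and mask name below is hypothetical, and the path-based pruning and HCA components are omitted.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleAttentionSketch(nn.Module):
    # Hypothetical sketch of the multi-scale idea, NOT the authors' MSA-GT:
    # one attention mechanism per scale (e.g., 1-hop, 2-hop, global), merged
    # by a learned per-node, per-scale fusion weight.
    def __init__(self, dim: int, num_scales: int = 3, num_heads: int = 4):
        super().__init__()
        self.scale_attns = nn.ModuleList(
            [nn.MultiheadAttention(dim, num_heads, batch_first=True)
             for _ in range(num_scales)]
        )
        self.fusion_score = nn.Linear(dim, 1)  # scores each scale's output

    def forward(self, x, scale_masks):
        # x: (batch, num_nodes, dim); scale_masks[s]: (num_nodes, num_nodes)
        # boolean mask where True marks a pair NOT attendable at scale s.
        outs = [attn(x, x, x, attn_mask=m)[0]
                for attn, m in zip(self.scale_attns, scale_masks)]
        stacked = torch.stack(outs, dim=2)                    # (B, N, S, D)
        w = F.softmax(self.fusion_score(stacked), dim=2)      # (B, N, S, 1)
        return (w * stacked).sum(dim=2)                       # (B, N, D)

# Toy usage (hypothetical): scale 1 attends over 1-hop neighbors plus self,
# scale 2 attends globally.
x = torch.randn(1, 5, 16)
adj = torch.rand(5, 5) < 0.4
masks = [~(adj | torch.eye(5, dtype=torch.bool)),
         torch.zeros(5, 5, dtype=torch.bool)]
model = MultiScaleAttentionSketch(dim=16, num_scales=2, num_heads=4)
print(model(x, masks).shape)  # torch.Size([1, 5, 16])

Under these assumptions, the scale masks would come from powers of the graph's adjacency matrix (k-hop reachability), which is one natural reading of the "local and global perspectives" mentioned in the abstract.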

Keyword:

Curricula; Distribution transformers

Community:

  • [ 1 ] [Zhuang, Jianzhi] College of Computer and Data Science, Fuzhou University, Fuzhou 350108, China
  • [ 2 ] [Li, Jin] AI Thrust, Information Hub, The Hong Kong University of Science and Technology (Guangzhou), Guangzhou, China
  • [ 3 ] [Shi, Chenjunhao] College of Computer and Data Science, Fuzhou University, Fuzhou 350108, China
  • [ 4 ] [Lin, Xinyi] College of Computer and Data Science, Fuzhou University, Fuzhou 350108, China
  • [ 5 ] [Fu, Yang-Geng] College of Computer and Data Science, Fuzhou University, Fuzhou 350108, China

Reprint Author's Address:

  • [Fu, Yang-Geng] College of Computer and Data Science, Fuzhou University, Fuzhou 350108, China



Source:

Knowledge-Based Systems

ISSN: 0950-7051

Year: 2025

Volume: 309

Impact Factor: 7.200 (JCR@2023)

ESI Highly Cited Papers on the List: 0
