Abstract:
Graph representation learning is a crucial area of machine learning, with widespread applications in social networks, recommendation systems, and traffic flow prediction. Recently, Graph Transformers have emerged as powerful tools for this purpose and have garnered significant attention. In this work, we identify a fundamental issue in previous Graph Transformers: they overlook the scale-related information gap and apply an identical attention computation to node interactions at different scales, which leads to suboptimal performance. To address this, we propose a Multi-Scale Attention Graph Transformer (MSA-GT) that enables each node to conduct adaptive interactions conditioned on different scales from both local and global perspectives. Specifically, MSA-GT guides several attention mechanisms to focus on individual scales and then performs customized combinations via an attention-based fusion module, thereby obtaining more semantically fine-grained node representations. Despite the potential of this design, we still observe some over-fitting, a typical challenge when training Graph Transformers, and therefore propose two additional technical components to prevent over-fitting and further improve performance. First, we introduce a path-based pruning strategy that reduces ineffective attention interactions and facilitates more accurate selection of relevant nodes. Second, we propose a Heterophilous Curriculum Augmentation (HCA) module that gradually increases the training difficulty, forming a weak-to-strong regularization scheme that enhances the model's generalization ability step by step. Extensive experiments on eight public graph benchmarks show that our method outperforms many state-of-the-art methods, demonstrating its effectiveness. © 2024 Elsevier B.V.
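The abstract does not provide implementation details, so the following is a minimal sketch, under assumptions, of the core idea it describes: running a separate attention mechanism per scale and fusing the per-scale outputs with an attention-based combination. The class name MultiScaleAttentionLayer, the use of hop-distance boolean masks to define scales, and the PyTorch setting are illustrative assumptions, not the authors' released code.

```python
# Sketch (assumption, not the authors' code): one self-attention module per scale,
# where each "scale" is given by a neighborhood mask, followed by an
# attention-based fusion over the per-scale node representations.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleAttentionLayer(nn.Module):
    def __init__(self, dim: int, num_scales: int, num_heads: int = 4):
        super().__init__()
        # One self-attention module per scale (e.g. 1-hop, 2-hop, global).
        self.scale_attns = nn.ModuleList(
            [nn.MultiheadAttention(dim, num_heads, batch_first=True)
             for _ in range(num_scales)]
        )
        # Fusion: score each scale's output per node, softmax over scales.
        self.fusion_score = nn.Linear(dim, 1)

    def forward(self, x: torch.Tensor, scale_masks: list) -> torch.Tensor:
        # x: (batch, num_nodes, dim)
        # scale_masks[s]: (num_nodes, num_nodes) bool, True = block attention
        per_scale = []
        for attn, mask in zip(self.scale_attns, scale_masks):
            out, _ = attn(x, x, x, attn_mask=mask)
            per_scale.append(out)
        stacked = torch.stack(per_scale, dim=2)                 # (B, N, S, dim)
        weights = F.softmax(self.fusion_score(stacked), dim=2)  # (B, N, S, 1)
        return (weights * stacked).sum(dim=2)                   # (B, N, dim)


if __name__ == "__main__":
    B, N, D = 2, 16, 64
    x = torch.randn(B, N, D)
    # Toy scales: a sparse "local" mask and an unmasked "global" view.
    local_mask = torch.rand(N, N) > 0.3
    local_mask.fill_diagonal_(False)          # every node can attend to itself
    global_mask = torch.zeros(N, N, dtype=torch.bool)
    layer = MultiScaleAttentionLayer(D, num_scales=2)
    print(layer(x, [local_mask, global_mask]).shape)  # torch.Size([2, 16, 64])
```

The fusion step mirrors the abstract's "customized combinations": the learned softmax weights let each node decide how much each scale contributes to its final representation. The path-based pruning and HCA components are not sketched here, as the abstract gives no structural detail about them.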
Source:
Knowledge-Based Systems
ISSN: 0950-7051
Year: 2025
Volume: 309
Impact Factor: 7.200 (JCR@2023)
ESI Highly Cited Papers on the List: 0