
Author:

Li, Jin [1] | Zhang, Qirong [2] | Liu, Wenxi [3] (Scholars: 刘文犀) | Chan, Antoni B. [4] | Fu, Yang-Geng [5] (Scholars: 傅仰耿)

Indexed by:

Scopus SCIE

Abstract:

Graph neural networks (GNNs) are widely used for analyzing graph-structured data and solving graph-related tasks thanks to their powerful expressiveness. However, existing off-the-shelf GNN-based models usually consist of no more than three layers: deeper GNNs tend to suffer severe performance degradation due to several issues, including the infamous "over-smoothing" issue, which restricts the further development of GNNs. In this article, we investigate over-smoothing in deep GNNs. We discover that over-smoothing not only yields indistinguishable node embeddings but also alters, and can even corrupt, their semantic structures, a phenomenon we dub semantic over-smoothing. Existing techniques, e.g., graph normalization, address the former concern but neglect the importance of preserving semantic structures in the spatial domain, which hinders further improvement of model performance. To alleviate this concern, we propose a cluster-keeping sparse aggregation strategy that preserves the semantic structure of embeddings in deep GNNs (especially spatial GNNs). In particular, our strategy heuristically redistributes the extent of aggregation across nodes and layers, instead of aggregating them equally, so that deep layers aggregate concise yet meaningful information. Without any bells and whistles, it can be implemented as a plug-and-play structure for GNNs via weighted residual connections. Finally, we analyze the over-smoothing issue in GNNs with weighted residual structures and conduct experiments demonstrating performance comparable to the state of the art.
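Illustrative sketch (not from the paper): based only on the abstract's description, the following minimal PyTorch example shows how a weighted residual connection can temper per-layer aggregation in a deep GNN. The class name WeightedResidualGCNLayer, the 1/(k+1) per-layer weighting schedule, and the toy graph are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn


class WeightedResidualGCNLayer(nn.Module):
    """One graph-convolution step blended with the incoming embedding.

    out = alpha * (A_hat @ X @ W) + (1 - alpha) * X
    A smaller alpha means the layer aggregates less from neighbors,
    limiting how much each deep layer smooths node embeddings.
    """

    def __init__(self, dim: int, alpha: float):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.alpha = alpha

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        aggregated = adj_norm @ self.lin(x)  # neighborhood aggregation
        return self.alpha * aggregated + (1.0 - self.alpha) * x  # weighted residual


# Toy usage: 5 nodes, 8-dim features, alpha decaying over 6 layers (hypothetical schedule).
n, d, depth = 5, 8, 6
x = torch.randn(n, d)
adj = torch.randint(0, 2, (n, n)).float()
adj = ((adj + adj.T + torch.eye(n)) > 0).float()        # symmetrize, add self-loops
deg_inv_sqrt = adj.sum(1).clamp(min=1).pow(-0.5)
adj_norm = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]  # D^-1/2 A D^-1/2

layers = [WeightedResidualGCNLayer(d, alpha=1.0 / (k + 1)) for k in range(depth)]
for layer in layers:
    x = torch.relu(layer(x, adj_norm))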

Keyword:

Aggregates; Brain modeling; Clustering; Convolution; deep graph neural networks (GNNs); Degradation; node classification; Numerical models; over-smoothing; Semantics; sparse aggregation strategy; Task analysis

Community:

  • [ 1 ] [Li, Jin]Fuzhou Univ, Coll Comp & Data Sci, Fuzhou 350108, Peoples R China
  • [ 2 ] [Zhang, Qirong]Fuzhou Univ, Coll Comp & Data Sci, Fuzhou 350108, Peoples R China
  • [ 3 ] [Liu, Wenxi]Fuzhou Univ, Coll Comp & Data Sci, Fuzhou 350108, Peoples R China
  • [ 4 ] [Fu, Yang-Geng]Fuzhou Univ, Coll Comp & Data Sci, Fuzhou 350108, Peoples R China
  • [ 5 ] [Li, Jin]Hong Kong Univ Sci & Technol Guangzhou, AI Thrust, Informat Hub, Guangzhou 511400, Peoples R China
  • [ 6 ] [Chan, Antoni B.]City Univ Hong Kong, Dept Comp Sci, Hong Kong, Peoples R China

Reprint Author's Address:

  • [Fu, Yang-Geng]Fuzhou Univ, Coll Comp & Data Sci, Fuzhou 350108, Peoples R China


Source:

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS

ISSN: 2162-237X

Year: 2024

Issue: 4

Volume: 36

Page: 6897-6910

Impact Factor: 10.200 (JCR@2023)

Cited Count:

WoS CC Cited Count: 1

ESI Highly Cited Papers on the List: 0
