Abstract:
Dynamic graph representation learning has attracted much attention in many practical applications. One interesting method uses RNNs (e.g., LSTM, GRU) to update a GCN's weights dynamically from the weights of the previous time step. However, it uses only a single time step's weights, which leads to a lack of sufficient historical information. In this work, we focus on this parameter-evolution approach and propose an evolving GCN model that adopts an attention mechanism to gather richer historical information, so that the RNNs decode better-fused historical representations and capture the temporal correlation of the GCN's weights. The model can not only learn dynamic graphs with fewer features, but also extract richer historical information to learn from. We evaluate our method on the link prediction task, and the results show better performance on most of the datasets we test. © 2021, Springer Nature Switzerland AG.
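The abstract does not spell out the formulation, but the mechanism it describes can be sketched: an attention layer scores a window of past GCN weight matrices and fuses them into one summary, which a GRU then decodes (with the previous step's weights as its hidden state) into the weights for the current time step. The sketch below is a minimal PyTorch illustration under those assumptions; the names (HistoryAttentionEvolve, gcn_step), the mean-pooled attention keys, and the window size are hypothetical, not the paper's actual code.

```python
# Minimal sketch of the mechanism described in the abstract (assumptions,
# not the paper's implementation): attention fuses K historical GCN weight
# matrices, and a GRU evolves the weights from that fused summary.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HistoryAttentionEvolve(nn.Module):
    """Evolve a GCN weight matrix with a GRU that reads an attention-fused
    summary of several past weight matrices instead of only the last one."""

    def __init__(self, out_dim: int):
        super().__init__()
        # Each row of an (in_dim, out_dim) weight matrix is one GRU "token".
        self.gru = nn.GRUCell(out_dim, out_dim)
        # Scores one pooled historical weight matrix per past time step.
        self.score = nn.Linear(out_dim, 1)

    def fuse(self, history: list[torch.Tensor]) -> torch.Tensor:
        # history: K past weight matrices, each of shape (in_dim, out_dim).
        hist = torch.stack(history)                     # (K, in_dim, out_dim)
        keys = hist.mean(dim=1)                         # (K, out_dim) keys
        alpha = F.softmax(self.score(keys), dim=0)      # (K, 1) attention
        return (alpha.unsqueeze(-1) * hist).sum(dim=0)  # fused (in_dim, out_dim)

    def forward(self, history: list[torch.Tensor]) -> torch.Tensor:
        fused = self.fuse(history)    # summary of the whole history window
        prev = history[-1]            # previous weights as the hidden state
        return self.gru(fused, prev)  # new weights for the current step


def gcn_step(adj_norm: torch.Tensor, x: torch.Tensor,
             weight: torch.Tensor) -> torch.Tensor:
    """One GCN propagation with the evolved weights: ReLU(A_hat X W)."""
    return torch.relu(adj_norm @ x @ weight)


# Hypothetical usage over a window of K = 4 past weight matrices.
in_dim, out_dim, n_nodes = 16, 8, 10
evolver = HistoryAttentionEvolve(out_dim)
history = [torch.randn(in_dim, out_dim) for _ in range(4)]
w_t = evolver(history)            # weights for the current step
x = torch.randn(n_nodes, in_dim)  # node features at the current snapshot
adj_norm = torch.eye(n_nodes)     # stand-in for a normalized adjacency
h = gcn_step(adj_norm, x, w_t)    # node embeddings used for link prediction
```

Compared with conditioning the GRU only on history[-1] (the single-step scheme the abstract critiques), the attention-fused input lets the recurrent update see the whole window of past weights at once.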
Source: Communications in Computer and Information Science (CCIS)
ISSN: 1865-0929
Year: 2021
Volume: 1517 CCIS
Page: 369-376
Language: English
ESI Highly Cited Papers on the List: 0