Abstract:
Most aspect-level sentiment studies fail to model the interaction between aspect words and their context and do not make full use of semantic information. To address these problems, a model based on self-attention and a graph convolutional network is proposed. To improve the semantic representation ability of the model, a multi-head self-attention mechanism is used to capture long-distance dependencies in the text, combined with a dependency-type matrix. A weight matrix that fuses position information and relation-type information is then computed and fed into the graph convolutional network to obtain the text feature representation. In addition, a text-aspect attention layer extracts context-sensitive aspect features, which are fed into a graph convolutional network to obtain the aspect feature representation. Finally, the two feature vectors are concatenated to perform sentiment analysis. Experimental results on two public datasets show that the proposed model outperforms the comparison models overall. © 2024 Beijing University of Posts and Telecommunications. All rights reserved.
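The pipeline described in the abstract, multi-head self-attention over the token sequence followed by graph convolution over a syntactic dependency graph, can be sketched as below. This is a minimal illustrative sketch, not the paper's actual implementation: the identity (unlearned) projections, the toy dimensions, and the hand-written adjacency matrix are all assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, num_heads):
    # X: (n_tokens, d_model). For simplicity each head attends over its own
    # slice of X with no learned Q/K/V projections (an assumption of this sketch).
    n, d = X.shape
    d_k = d // num_heads
    heads = []
    for h in range(num_heads):
        Xh = X[:, h * d_k:(h + 1) * d_k]
        scores = Xh @ Xh.T / np.sqrt(d_k)   # scaled pairwise token affinities
        heads.append(softmax(scores) @ Xh)  # attention-weighted values
    return np.concatenate(heads, axis=-1)   # back to (n_tokens, d_model)

def gcn_layer(A, H, W):
    # One graph-convolution layer: ReLU(D^{-1/2} (A + I) D^{-1/2} H W),
    # where A is the (symmetric) dependency-graph adjacency matrix.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

# Toy example: 4 tokens, d_model = 8, adjacency from a small dependency tree.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))               # token embeddings
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)     # syntactic edges (hypothetical)
W = rng.standard_normal((8, 8))               # GCN weight matrix

context = multi_head_self_attention(X, num_heads=2)  # long-range dependencies
features = gcn_layer(A, context, W)                  # syntax-aware features
print(features.shape)
```

In the full model, the adjacency matrix would additionally be weighted by the position- and relation-type information described in the abstract, and a parallel aspect branch would be concatenated before classification.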
Source:
Journal of Beijing University of Posts and Telecommunications
ISSN: 1007-5321
CN: 11-3570/TN
Year: 2024
Issue: 1
Volume: 47
Page: 127-132