Author:

Chen, Kejia (陈可嘉) [1] | Huang, Chunxiang [2] | Lin, Hongxi [3]

Indexed by:

EI Scopus PKU CSCD

Abstract:

Most aspect-level sentiment studies lack interaction information between aspect words and their context and cannot make full use of semantic information. To address these problems, a model based on self-attention and a graph convolutional network is proposed. To improve the semantic representation ability of the model, a multi-head self-attention mechanism is used to capture long-distance dependencies in the text, combined with a dependency-type matrix. A weight matrix that fuses location information and relation-type information is then computed and fed into the graph convolutional network to obtain the text feature representation. In addition, a text-aspect attention layer is employed to extract context-sensitive aspect features, which are fed into a graph convolutional network to obtain the aspect feature representation. Finally, the two vectors are concatenated to complete the sentiment analysis task. Simulation results show that the overall performance of the proposed model is better than that of other comparison models on two open datasets. © 2024 Beijing University of Posts and Telecommunications. All rights reserved.
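
The abstract describes a dual-branch architecture: multi-head self-attention, combined with a weight matrix built from dependency types and positions, feeds a graph convolutional network to produce the text representation, while an aspect attention layer feeds a second graph convolutional network to produce the aspect representation; the two vectors are concatenated for classification. The sketch below is a minimal PyTorch rendering of that pipeline; all module names, tensor shapes, and the construction of the dependency/position weight matrix are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the dual-branch model described in the abstract.
# Assumptions: PyTorch, 300-d embeddings, and a precomputed (T x T) weight
# matrix that already fuses dependency-type and location information.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphConv(nn.Module):
    """One graph convolution layer: H' = ReLU(A · H · W)."""
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # adj: (B, N, N) weight matrix; h: (B, N, dim) node features
        return F.relu(self.linear(torch.matmul(adj, h)))

class AspectGCNSentiment(nn.Module):
    def __init__(self, dim=300, heads=6, num_classes=3):
        super().__init__()
        # Multi-head self-attention captures long-distance dependencies.
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.text_gcn = GraphConv(dim)
        self.aspect_gcn = GraphConv(dim)
        self.classifier = nn.Linear(2 * dim, num_classes)

    def forward(self, text_emb, aspect_emb, dep_weight):
        # text_emb:   (B, T, dim) context embeddings
        # aspect_emb: (B, A, dim) aspect-word embeddings
        # dep_weight: (B, T, T) weight matrix combining dependency-type
        #             and location information (assumed precomputed)
        ctx, _ = self.self_attn(text_emb, text_emb, text_emb)
        text_feat = self.text_gcn(ctx, dep_weight).mean(dim=1)        # (B, dim)

        # Aspect branch: attend from aspect words to the context, then GCN.
        scores = torch.matmul(aspect_emb, ctx.transpose(1, 2))        # (B, A, T)
        aspect_ctx = torch.matmul(F.softmax(scores, dim=-1), ctx)     # (B, A, dim)
        aspect_adj = torch.eye(aspect_ctx.size(1),
                               device=aspect_ctx.device).unsqueeze(0)
        aspect_feat = self.aspect_gcn(aspect_ctx, aspect_adj).mean(dim=1)

        # Concatenate both representations and predict sentiment polarity.
        return self.classifier(torch.cat([text_feat, aspect_feat], dim=-1))

# Example: batch of 2 sentences, 20 context tokens, 3 aspect tokens.
model = AspectGCNSentiment()
logits = model(torch.randn(2, 20, 300),
               torch.randn(2, 3, 300),
               torch.softmax(torch.randn(2, 20, 20), dim=-1))
```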

Keyword:

Convolution; Semantics; Semantic Web; Sentiment analysis

Community:

  • [1] [Chen, Kejia] School of Economics and Management, Fuzhou University, Fuzhou 350108, China
  • [2] [Huang, Chunxiang] School of Economics and Management, Fuzhou University, Fuzhou 350108, China
  • [3] [Lin, Hongxi] School of Business, Putian University, Putian 351100, China

Reprint's Address:

Email:

Source:

Journal of Beijing University of Posts and Telecommunications

ISSN: 1007-5321

CN: 11-3570/TN

Year: 2024

Issue: 1

Volume: 47

Page: 127-132

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:
