Abstract:
With the rapid development of Ethereum, vast amounts of data are recorded on the blockchain through transactions, encompassing diverse and extensive textual information. While Long Short-Term Memory (LSTM) models have proven effective for sentiment analysis in recent years, they tend to treat different features as equally important when processing such textual data. This study therefore introduces a Bidirectional LSTM model with a Multi-Head Attention mechanism (MABLSTM) designed for sentiment analysis of Ethereum transaction texts. The BLSTM consists of two independent LSTMs that process the sequence in opposite directions, capturing contextual information from both the past and the future. The outputs of the BLSTM layer are then reweighted by a multi-head attention mechanism to amplify the importance of sentiment words and blockchain-specific terms. This paper evaluates MABLSTM through experiments on an Ethereum transaction dataset, comparing it with CNN, SVM, ABLSTM, and ABCDM. The results demonstrate the effectiveness and superiority of MABLSTM in sentiment analysis tasks. The approach accurately identifies sentiment polarity in Ethereum transaction texts, providing valuable information that supports decision-making and emotional analysis for Ethereum participants and researchers.
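Note: the abstract describes an embedding layer followed by a bidirectional LSTM whose outputs are reweighted by multi-head attention before classification. The following PyTorch sketch illustrates that pipeline only; all hyperparameters (embedding size, hidden size, number of heads, number of classes) and layer choices are illustrative assumptions, not the authors' reported implementation.

import torch
import torch.nn as nn

class MABLSTM(nn.Module):
    """Sketch of a BiLSTM + multi-head attention sentiment classifier."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 num_heads=4, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Two independent LSTMs read the sequence forward and backward.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Multi-head self-attention reweights BiLSTM outputs so that
        # sentiment words and blockchain-specific terms gain importance.
        self.attention = nn.MultiheadAttention(embed_dim=2 * hidden_dim,
                                               num_heads=num_heads,
                                               batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)            # (batch, seq, embed_dim)
        h, _ = self.bilstm(x)                    # (batch, seq, 2*hidden_dim)
        attn_out, _ = self.attention(h, h, h)    # self-attention over timesteps
        pooled = attn_out.mean(dim=1)            # average over the sequence
        return self.classifier(pooled)           # sentiment logits

# Example usage on a dummy batch of tokenized transaction texts.
model = MABLSTM(vocab_size=10000)
dummy_batch = torch.randint(0, 10000, (2, 32))   # 2 sequences of 32 token ids
logits = model(dummy_batch)
print(logits.shape)                              # torch.Size([2, 2])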
Source:
COGNITIVE COMPUTING - ICCC 2023
ISSN: 0302-9743
Year: 2024
Volume: 14207
Page: 100-115
JCR@2005: 0.402