
[Conference Paper]

Automatic text summarization based on transformer and switchable normalization


Author:

Luo, T. [1] | Guo, K. [2] | Guo, H. [3]

Indexed by:

Scopus

Abstract:

With the development of text summarization research, methods based on RNNs with the encoder-decoder model have gradually become mainstream. However, RNNs tend to forget previous context information, which leads to a loss of original information and reduces the accuracy of the generated summaries. The Transformer model uses the self-attention mechanism to encode and decode historical information, so it can achieve better performance than RNNs in learning context information. In this paper, a text summarization model based on the Transformer and switchable normalization is proposed. The accuracy of the model is improved by optimizing the normalization layer. Compared with other models, the new model has a clear advantage in understanding words' semantics and associations. Experimental results on the English Gigaword dataset show that the proposed model achieves high ROUGE scores and produces more readable summaries. © 2019 IEEE.
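The record does not include implementation details for the optimized normalization layer. As an illustrative aid only, the sketch below shows how a switchable-normalization layer (in the sense of Luo et al., "Differentiable Learning-to-Normalize via Switchable Normalization", 2018) might be adapted to Transformer-shaped activations of shape (batch, seq_len, d_model). The module name, shapes, and dimension choices are assumptions for this sketch, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SwitchableNorm1d(nn.Module):
    """Sketch of switchable normalization for inputs of shape (batch, seq_len, d_model).

    Blends layer-, instance-, and batch-style statistics with learned softmax
    weights, then applies a standard affine transform. This is an illustrative
    adaptation to sequence data, not the paper's reference implementation.
    """

    def __init__(self, d_model: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(d_model))
        self.beta = nn.Parameter(torch.zeros(d_model))
        # One set of blend weights for the means, another for the variances.
        self.mean_weight = nn.Parameter(torch.ones(3))
        self.var_weight = nn.Parameter(torch.ones(3))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Layer-norm statistics: over the feature dimension, per token.
        mu_ln = x.mean(dim=-1, keepdim=True)
        var_ln = x.var(dim=-1, keepdim=True, unbiased=False)
        # Instance-norm-style statistics: over the sequence, per sample and feature.
        mu_in = x.mean(dim=1, keepdim=True)
        var_in = x.var(dim=1, keepdim=True, unbiased=False)
        # Batch-norm-style statistics: over batch and sequence, per feature.
        mu_bn = x.mean(dim=(0, 1), keepdim=True)
        var_bn = x.var(dim=(0, 1), keepdim=True, unbiased=False)

        # Softmax-normalized importance weights select the mixture of normalizers.
        w_mu = F.softmax(self.mean_weight, dim=0)
        w_var = F.softmax(self.var_weight, dim=0)
        mu = w_mu[0] * mu_ln + w_mu[1] * mu_in + w_mu[2] * mu_bn
        var = w_var[0] * var_ln + w_var[1] * var_in + w_var[2] * var_bn

        x_hat = (x - mu) / torch.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta


if __name__ == "__main__":
    sn = SwitchableNorm1d(d_model=512)
    hidden = torch.randn(8, 40, 512)  # (batch, seq_len, d_model)
    out = sn(hidden)                  # same shape, normalized
    print(out.shape)                  # torch.Size([8, 40, 512])
```

In such a layer, the learned softmax weights let the model choose per layer how much each normalizer contributes, which is one plausible way to "optimize the normalization layer" of a Transformer as the abstract describes.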

Keyword:

Attention; Encoder-decoder model; Switchable normalization; Text summarization; Transformer

Affiliations:

  • [ 1 ] [Luo, T.]College of Mathematics and Computer Sciences, Fuzhou University, Fuzhou, China
  • [ 2 ] [Guo, K.]College of Mathematics and Computer Sciences, Fuzhou University, Fuzhou, China
  • [ 3 ] [Guo, H.]College of Mathematics and Computer Sciences, Fuzhou University, Fuzhou, China

Reprint Author's Address:


Source:

SocialCom 2019

Year: 2019

Page: 1606-1611

Language: English

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count: 2


Affiliated Colleges:
