Abstract:
Non-local attention has attracted widespread research interest for enhancing the ability of deep single-image super-resolution methods to exploit self-similarity information. However, when capturing long-range dependencies, non-local attention often suffers from high computational complexity and inaccurate cross-correlation estimates for deep features. To address these problems, we propose an innovative and efficient Weighted Adaptive Clustering Attention (WACA). Specifically, WACA consists of Sparse Adaptive Clustering Attention (SACA) and Weighted Residual Attention (WRA). With the help of asymmetric locality-sensitive hashing, SACA significantly reduces the noise introduced during feature-correlation computation and lowers the computational cost from quadratic to asymptotically linear in the sequence length. In addition, our WRA exploits the high correlation between the self-similarity matrices of adjacent self-attention layers, further reducing the cost of aggregating non-local information by jointly optimizing the self-similarity matrix. To verify the effectiveness of WACA, we integrate the corresponding modules into a residual backbone and construct a framework named the Weighted Adaptive Clustering Network (WACN). Experimental results demonstrate that WACN achieves competitive performance in both quantitative and qualitative evaluations.
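The abstract's core idea in SACA is to use locality-sensitive hashing so that each query attends only to keys in its own hash bucket, which shrinks the quadratic cost of full attention. The paper's actual asymmetric-LSH design is not given here, so the sketch below is only a generic illustration of bucketed attention with a sign-random-projection hash; all names (`lsh_bucket_attention`, `n_bits`) are hypothetical.

```python
import numpy as np

def lsh_bucket_attention(q, k, v, n_bits=4, seed=0):
    """Illustrative LSH-bucketed attention (not the paper's SACA).

    q, k, v: arrays of shape (n, d). Each query attends only to keys
    that fall in the same sign-random-projection bucket.
    """
    n, d = q.shape
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((d, n_bits))  # shared random hyperplanes

    def bucket_codes(x):
        # sign pattern of the projections -> integer bucket id
        bits = (x @ proj) > 0
        return bits @ (1 << np.arange(n_bits))

    q_codes, k_codes = bucket_codes(q), bucket_codes(k)
    out = np.zeros_like(q)
    for b in np.unique(q_codes):
        qi = np.where(q_codes == b)[0]
        ki = np.where(k_codes == b)[0]
        if ki.size == 0:
            continue  # no keys hashed to this bucket; output stays zero
        scores = q[qi] @ k[ki].T / np.sqrt(d)
        w = np.exp(scores - scores.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)  # softmax within the bucket
        out[qi] = w @ v[ki]
    return out
```

With roughly uniform buckets of size n / 2^n_bits, the per-bucket score matrices are small, so the total cost grows roughly linearly in n for a fixed bucket size, which is the complexity behavior the abstract claims for SACA.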
Source:
2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024
ISSN: 2161-4393
Year: 2024