Abstract:
An accurate and reliable multi-step prediction model for dam displacement can provide decision-makers with crucial forecast information, mitigating safety risks associated with abnormal displacements. However, conventional deep learning models overlook the variability exhibited by the lag effect of dam displacement in continuous time domains, a limitation whose detrimental effect on model performance is amplified in the multi-step prediction scenario. To address this issue, this paper proposes a sequence-to-sequence Chrono-initialized long short-term memory model (S2S-CLSTM), which accounts for the dynamic nature of the dam displacement lag effect when explaining the nonlinear relationship between displacement and complex external features. Specifically, LSTM initialization is redefined using the Chrono initializer, and a sequence-to-sequence model is constructed based on the Chrono-initialized long short-term memory network (CLSTM). CLSTM adapts to the dynamic displacement lag effect by incorporating temporal dependency information into its implicit parameters, enhancing model generalization, while the S2S paradigm helps maintain robustness when interpreting dam displacements across different time scales. Secondly, the study introduces a dedicated re-optimization strategy tailored for S2S-CLSTM to mitigate the performance degradation caused by ambiguous definitions of displacement lag extents. The effectiveness of the model and method is verified using monitoring data from a real arch dam: within the prediction-step range of 3 to 7, S2S-CLSTM achieves an average R2 of 0.764, with MAE and RMSE values of only 0.623 mm and 0.797 mm, respectively. This work also explores the influence of dam section location on the displacement lag effect to underscore the importance of the proposed methods for dam displacement multi-step prediction.
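For context on the "Chrono initializer" named in the abstract: the standard Chrono initialization for LSTMs (Tallec & Ollivier, 2018) draws forget-gate biases as b_f ~ log(U[1, T_max - 1]) and ties input-gate biases to b_i = -b_f, so the gates' initial time constants roughly span an expected dependency range [1, T_max]. The sketch below illustrates that bias scheme in plain NumPy; the function name, signature, and choice of T_max are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def chrono_init_biases(hidden_size, t_max, rng=None):
    """Chrono initialization of LSTM gate biases (a minimal sketch).

    Forget-gate biases are drawn as b_f ~ log(U[1, t_max - 1]), giving
    initial memory time constants spread over [1, t_max]; input-gate
    biases are tied as b_i = -b_f so that, at initialization, the cell
    writes roughly what it forgets.
    """
    rng = np.random.default_rng() if rng is None else rng
    b_f = np.log(rng.uniform(1.0, t_max - 1.0, size=hidden_size))
    b_i = -b_f
    return b_f, b_i

# Example: a 64-unit CLSTM layer expected to track lags of up to 50 steps
# (t_max here is a hypothetical value chosen for illustration).
b_f, b_i = chrono_init_biases(64, t_max=50)
```

In a framework such as PyTorch, these values would simply overwrite the forget- and input-gate slices of the LSTM bias vectors before training; t_max would be set from the anticipated extent of the displacement lag effect.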
Source:
EXPERT SYSTEMS WITH APPLICATIONS
ISSN: 0957-4174
Year: 2025
Volume: 271
Impact Factor: 7.500 (JCR@2023)
CAS Journal Grade: 2
Cited Count:
WoS CC Cited Count: 1
SCOPUS Cited Count: 2
ESI Highly Cited Papers on the List: 0