Abstract:
Separable nonlinear models are widely used across disciplines such as system identification, signal analysis, electrical engineering, and machine learning. Identifying these models is inherently a non-convex optimization problem. While gradient descent (GD) is a commonly adopted method, it often converges slowly and is highly sensitive to the choice of step size. To mitigate these issues, we augment GD with Anderson acceleration (AA) and propose a Hierarchical GD with Anderson acceleration (H-AAGD) method for efficient identification of separable nonlinear models. The approach relaxes the conventional step-size constraints of GD and accounts for the coupling between different parameters during optimization, making the search for a solution more efficient. Unlike Newton's method, the algorithm does not require computing the inverse of the Hessian matrix, which simplifies the computation. We also provide a theoretical analysis of the algorithm's convergence and complexity and validate its effectiveness through a series of numerical experiments.
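This record does not reproduce the paper's algorithmic details. For reference, the following is a minimal Python sketch of the generic building block the abstract names: Anderson acceleration applied to the fixed-point map of gradient descent. All names, signatures, and defaults here are illustrative assumptions, not the authors' H-AAGD implementation.

import numpy as np

def anderson_gd(grad, x0, step=1e-2, m=5, tol=1e-8, max_iter=500):
    """Gradient descent accelerated with Anderson mixing (a generic
    sketch under assumed defaults, not the paper's exact H-AAGD).

    grad : callable returning the gradient of the loss at x
    m    : Anderson memory depth (number of stored residual differences)
    """
    g = lambda x: x - step * grad(x)            # fixed-point map of plain GD
    x = np.asarray(x0, dtype=float)
    X_hist, G_hist = [x], [g(x)]                # iterates and their images under g
    for _ in range(max_iter):
        F = [Gi - Xi for Gi, Xi in zip(G_hist, X_hist)]   # residuals g(x) - x
        if np.linalg.norm(F[-1]) < tol:
            break
        if len(F) > 1:
            # least-squares mixing over residual differences (type-II Anderson)
            dF = np.column_stack([F[j + 1] - F[j] for j in range(len(F) - 1)])
            dG = np.column_stack([G_hist[j + 1] - G_hist[j] for j in range(len(F) - 1)])
            gamma, *_ = np.linalg.lstsq(dF, F[-1], rcond=None)
            x_new = G_hist[-1] - dG @ gamma     # Anderson-extrapolated iterate
        else:
            x_new = G_hist[-1]                  # plain GD step on the first pass
        X_hist.append(x_new)
        G_hist.append(g(x_new))
        X_hist, G_hist = X_hist[-(m + 1):], G_hist[-(m + 1):]  # keep memory of depth m
    return X_hist[-1]

For a separable model, the hierarchical scheme described in the abstract would presumably apply such an accelerated update to the nonlinear parameters while handling the linear parameters in a separate layer; that split is inferred from the abstract, not reproduced from the paper.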
Source: NONLINEAR DYNAMICS
ISSN: 0924-090X
Year: 2024
Issue: 10
Volume: 113
Page: 11371-11387
Impact Factor: 5.200 (JCR@2023)
ESI Highly Cited Papers on the List: 0