Indexed by:
Abstract:
Many inverse problems in machine learning, system identification, and image processing involve nuisance parameters that are important for recovering the other parameters. Separable nonlinear optimization problems fall into this category, and their special separable structure has inspired several efficient optimization strategies. A well-known one is variable projection (VP), which projects out a subset of the estimated parameters, yielding a reduced problem with fewer parameters. Expectation maximization (EM) is another separation-based method and provides a powerful framework for estimating nuisance parameters. Although both handle part of the parameters in a similar way, the relationship between EM and VP has been overlooked in previous studies. In this article, we explore the internal relationships and differences between VP and EM. Unlike algorithms that separate the parameters directly, the hierarchical identification algorithm decomposes a complex model into several linked submodels and identifies the corresponding parameters. Therefore, this article also studies the differences and connections between the hierarchical algorithm and parameter-separation algorithms such as VP and EM. In the numerical simulation part, Monte Carlo experiments are performed to further compare the performance of the different algorithms. The results show that the VP algorithm usually converges faster than the other two algorithms and is more robust to the initial values of the parameters.
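To make the separable structure concrete, the following is a minimal sketch of variable projection for a generic separable nonlinear least-squares model y ≈ Φ(α)c, in which the linear parameters c are eliminated via an inner least-squares solve and only the nonlinear parameters α are optimized. The exponential basis, synthetic data, and SciPy solver are illustrative assumptions, not the paper's implementation.

```python
# Minimal variable projection (VP) sketch for a separable model
# y ≈ Phi(alpha) @ c: c (linear) is projected out, alpha (nonlinear) is optimized.
import numpy as np
from scipy.optimize import least_squares

def phi(alpha, t):
    # Illustrative basis: two decaying exponentials parameterized by alpha.
    return np.column_stack([np.exp(-alpha[0] * t), np.exp(-alpha[1] * t)])

def vp_residual(alpha, t, y):
    # Inner linear least-squares solve eliminates c; return the reduced residual.
    Phi = phi(alpha, t)
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return y - Phi @ c

# Synthetic data for the illustration only.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 100)
alpha_true, c_true = np.array([0.5, 2.0]), np.array([1.0, -0.7])
y = phi(alpha_true, t) @ c_true + 0.01 * rng.standard_normal(t.size)

# Only alpha enters the outer (reduced) problem; c is recovered afterwards.
sol = least_squares(vp_residual, x0=np.array([0.3, 1.0]), args=(t, y))
c_hat, *_ = np.linalg.lstsq(phi(sol.x, t), y, rcond=None)
print("alpha estimate:", sol.x, "c estimate:", c_hat)
```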
Keyword:
Reprint's Address:
Version:
Source: IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS
ISSN: 2168-2216
Year: 2022
Issue: 11
Volume: 52
Page: 7236-7247
8.7 (JCR@2022)
8.600 (JCR@2023)
ESI Discipline: ENGINEERING;
ESI HC Threshold: 66
JCR Journal Grade:1
CAS Journal Grade:1
Cited Count:
WoS CC Cited Count: 7
SCOPUS Cited Count: 8
ESI Highly Cited Papers on the List: 0
WanFang Cited Count:
Chinese Cited Count:
30 Days PV: 4
Affiliated Colleges: