Abstract:
Non-convex optimization regularized with a sparsity function has many applications in machine learning and other fields. Non-convex relaxation of the sparsity function often leads to more effective sparse optimization algorithms. In this paper, we propose a new weighted regularizer to approximate the sparsity function and derive a weighted thresholding operator for the sparse optimization problem with this regularizer. We then design iterative weighted thresholding algorithms and accelerate them using Nesterov's acceleration method together with a non-monotone line search. Under the Kurdyka-Łojasiewicz (KL) property and appropriate smoothness and convexity assumptions, we prove that the two algorithms converge at rates O(1/k) and O(1/k^2) respectively, where k is the iteration counter. Moreover, we develop convergent practical homotopy algorithms by invoking the two iterative weighted thresholding algorithms as subroutines. A series of numerical experiments demonstrates that, compared to state-of-the-art algorithms, our algorithms are superior in both average recovery rate and average running time on the sparse recovery problem, and competitive in solution quality and average running time on the logistic regression problem.
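The abstract describes an iterative weighted thresholding scheme and a Nesterov-accelerated variant. Below is a minimal illustrative sketch in Python of that general pattern, assuming a least-squares data-fit term and a weighted l1 penalty as a stand-in proximal step; the paper's actual non-convex regularizer, its derived thresholding operator, and the non-monotone line search are not reproduced here, and the 1/L step size and FISTA-style momentum are assumptions for illustration only.

```python
import numpy as np

def weighted_soft_threshold(z, tau):
    """Entrywise weighted soft-thresholding: prox of sum_i tau_i * |x_i|.

    NOTE: a stand-in. The paper derives a thresholding operator for its own
    non-convex weighted regularizer, which this does not reproduce.
    """
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def iterative_weighted_thresholding(A, b, weights, lam=0.1, n_iter=200):
    """Illustrative loop for min 0.5*||Ax - b||^2 + lam * sum_i w_i |x_i|."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)         # gradient of the smooth data-fit term
        x = weighted_soft_threshold(x - grad / L, lam * weights / L)
    return x

def accelerated_weighted_thresholding(A, b, weights, lam=0.1, n_iter=200):
    """Nesterov-accelerated variant (FISTA-style extrapolation).

    The paper additionally employs a non-monotone line search, omitted here.
    """
    L = np.linalg.norm(A, 2) ** 2
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = weighted_soft_threshold(y - grad / L, lam * weights / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x
```

The accelerated variant reuses the same thresholding step but extrapolates between successive iterates, which is what yields the improved O(1/k^2) rate for such schemes under convexity assumptions.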
Source: JOURNAL OF INDUSTRIAL AND MANAGEMENT OPTIMIZATION
ISSN: 1547-5816
Year: 2025
Volume: 21
Issue: 5
Pages: 3541-3579
JCR Impact Factor (2023): 1.200
CAS Journal Grade: 4