Abstract:
In recent years, sparsity-based feature selection (FS) methods have been extensively investigated due to their high performance. These methods solve the FS problem mainly by introducing sparsity regularization terms. However, most existing feature selection algorithms combine sparsity regularization with a simple linear loss function, which may lead to deficient performance. To this end, we propose a fresh and robust feature selection method that combines structured sparsity regularization, i.e., ℓ2,0-norm regularization, with the Softmax model to find a stable row-sparse solution: features can be selected in groups according to the rows of the projection matrix, and classification performance is improved by Softmax. Extensive experiments on six different datasets indicate that our method obtains better or comparable classification performance using fewer features than other advanced sparsity-based FS methods. © 2022, Springer Nature Singapore Pte Ltd.
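The group selection step described in the abstract can be sketched as follows: under ℓ2,0-norm row sparsity, a feature is kept or discarded as a whole row of the projection matrix, so features can be ranked by the ℓ2 norms of the rows. This is a minimal illustration only, not the authors' implementation; the function name and the toy matrix are hypothetical.

```python
import numpy as np

def select_features_l20(W, k):
    """Row-sparse (l2,0-style) feature selection sketch:
    keep the k rows of the projection matrix W (features x classes)
    with the largest l2 norms; all other rows are treated as zero,
    so the corresponding features are dropped as a group."""
    row_norms = np.linalg.norm(W, axis=1)       # l2 norm of each feature's row
    selected = np.argsort(row_norms)[::-1][:k]  # indices of the top-k rows
    return np.sort(selected)

# Toy projection matrix: rows 1 and 3 carry most of the weight,
# so those two features should be selected.
W = np.array([[0.01, 0.02],
              [1.50, -0.80],
              [0.05, 0.03],
              [-0.90, 1.20]])
print(select_features_l20(W, k=2))  # -> [1 3]
```

In the actual method, W would be learned jointly with the Softmax classifier under the ℓ2,0 constraint; the ranking above only shows how a row-sparse solution translates into group-wise feature selection.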
ISSN: 1865-0929
Year: 2022
Volume: 1515 CCIS
Page: 37-48
Language: English