Abstract:
Feature selection plays a critical role in many machine learning applications, as it mitigates the curse of dimensionality and improves the generalization of trained models. However, existing approaches for multi-class feature selection (MFS) often combine sparse regularization with a simple classification model, such as least squares regression, which can yield suboptimal performance. To address this limitation, this paper introduces a novel MFS method called Sparse Softmax Feature Selection (S²FS). S²FS combines an ℓ2,0-norm regularization with the Softmax model to perform feature selection. By using the ℓ2,0-norm, S²FS produces a more precise row-sparse solution for the feature selection matrix, while the Softmax model improves the interpretability of the model's outputs and thereby enhances classification performance. To further promote discriminative feature selection, a discriminative regularization derived from linear discriminant analysis (LDA) is incorporated into the learning model. In addition, an efficient optimization algorithm based on the alternating direction method of multipliers (ADMM) is designed to solve the S²FS objective function. Extensive experiments on various datasets demonstrate that S²FS achieves higher classification accuracy than several contemporary MFS methods.
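The abstract's core idea, ℓ2,0-norm row-sparsity on a feature-selection matrix combined with a Softmax classifier, can be illustrated with a minimal sketch. This is not the paper's ADMM algorithm; it is a hypothetical projected-gradient variant on synthetic data, where the ℓ2,0 constraint is enforced by keeping only the k rows of the weight matrix with the largest ℓ2 norms (rows correspond to features, so zeroed rows are discarded features):

```python
import numpy as np

def l20_project(W, k):
    """Keep the k rows of W with largest l2 norm and zero the rest:
    the Euclidean projection onto the constraint ||W||_{2,0} <= k."""
    norms = np.linalg.norm(W, axis=1)
    keep = np.argsort(norms)[-k:]
    P = np.zeros_like(W)
    P[keep] = W[keep]
    return P

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)  # shift for numerical stability
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def sparse_softmax_fs(X, y, k, lr=0.1, iters=300):
    """Projected gradient descent on softmax cross-entropy with an
    l2,0 row-sparsity constraint on the feature-selection matrix W."""
    n, d = X.shape
    c = y.max() + 1
    Y = np.eye(c)[y]                       # one-hot labels
    W = np.zeros((d, c))
    for _ in range(iters):
        P = softmax(X @ W)
        grad = X.T @ (P - Y) / n           # softmax cross-entropy gradient
        W = l20_project(W - lr * grad, k)  # gradient step, then l2,0 projection
    return W

# Synthetic data where only features 0 and 1 carry label information
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
W = sparse_softmax_fs(X, y, k=2)
selected = np.flatnonzero(np.linalg.norm(W, axis=1))
print(selected)  # indices of the features kept by the l2,0 constraint
```

Unlike an ℓ2,1 penalty, which shrinks row norms and only approximately zeroes rows, the ℓ2,0 constraint gives exact row sparsity: precisely k features survive, which is the sparsity behavior the abstract attributes to S²FS.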
Source:
INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
ISSN: 1868-8071
Year: 2024
Issue: 1
Volume: 16
Page: 159-172
Impact Factor: 3.100 (JCR@2023)