Abstract:
Backpropagation (BP) is a widely used method for training artificial neural networks (ANNs) on classification and regression tasks. However, its effectiveness diminishes on complex problems because of its susceptibility to local optima, sensitivity to initial parameters, and slow convergence. The arithmetic optimization algorithm (AOA) has emerged as a promising alternative for achieving global optimization when training ANNs. Nonetheless, like other metaheuristic search algorithms (MSAs), AOA suffers from inconsistent solution quality caused by random initialization, which limits its effectiveness on intricate optimization tasks. This paper introduces a modified initialization scheme for AOA that incorporates chaotic maps and oppositional-based learning (OL) to produce a higher-quality initial population, yielding a new AOA variant called AOA-COIS. As a training algorithm for ANNs, AOA-COIS optimizes the combination of weights, biases, and activation functions to enable accurate classification. Extensive simulations evaluate AOA-COIS on ten publicly available medical datasets and compare its performance with seven state-of-the-art MSA-based ANN training algorithms. The results demonstrate the strong capability of AOA-COIS in training ANN models, achieving high classification accuracy on most of the selected medical datasets. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2024.
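The abstract describes seeding the AOA population with chaotic maps and opposition-based learning before training begins. The sketch below illustrates that general idea, assuming a logistic chaotic map and the standard opposition rule (opposite of x is lb + ub - x); the paper's exact chaotic map, opposition formulation, and ANN fitness function are not given here, so the function name, parameters, and toy fitness are illustrative only.

```python
# Minimal sketch of a chaotic + opposition-based population initialization.
# Assumptions (not from the paper): logistic map as the chaotic sequence,
# standard opposition-based learning, and greedy selection by fitness.
import numpy as np

def chaotic_opposition_init(pop_size, dim, lower, upper, fitness, r=4.0, seed=0.7):
    """Generate candidates from a logistic chaotic sequence, form their
    opposite points, and keep the fittest pop_size individuals."""
    lower = np.full(dim, lower, dtype=float)
    upper = np.full(dim, upper, dtype=float)

    # Logistic map: x_{k+1} = r * x_k * (1 - x_k), values remain in (0, 1).
    x = seed
    chaotic = np.empty((pop_size, dim))
    for i in range(pop_size):
        for j in range(dim):
            x = r * x * (1.0 - x)
            chaotic[i, j] = x

    # Scale chaotic values into the search bounds.
    population = lower + chaotic * (upper - lower)

    # Opposition-based learning: opposite of x is lower + upper - x.
    opposite = lower + upper - population

    # Merge both sets and keep the pop_size candidates with the lowest fitness.
    combined = np.vstack([population, opposite])
    scores = np.apply_along_axis(fitness, 1, combined)
    best_idx = np.argsort(scores)[:pop_size]
    return combined[best_idx]

if __name__ == "__main__":
    # Toy fitness (sphere function) standing in for the ANN training loss
    # that AOA-COIS would minimize over weights, biases, and activations.
    init_pop = chaotic_opposition_init(
        pop_size=20, dim=10, lower=-1.0, upper=1.0,
        fitness=lambda w: float(np.sum(w ** 2)),
    )
    print(init_pop.shape)  # (20, 10)
```

In a full trainer, each row of the returned population would encode one candidate set of ANN weights and biases, and the regular AOA update rules would then operate on this improved starting population.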
ISSN: 2367-3370
Year: 2024
Volume: 845
Page: 329-341
Language: English