Open Access
ARTICLE
Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms
1 Faculty of Electrical and Electronics Engineering, Universiti Tun Hussein Onn Malaysia, Parit Raja, 86400, Johor, Malaysia
2 School of Mathematical Sciences, Universiti Sains Malaysia (USM), 11800, Penang, Malaysia
* Corresponding Author: Kim Gaik Tay. Email:
Computer Systems Science and Engineering 2023, 47(1), 1163-1184. https://doi.org/10.32604/csse.2023.038912
Received 03 January 2023; Accepted 03 March 2023; Issue published 26 May 2023
Abstract
Radial Basis Function Neural Network (RBFNN) ensembles have long suffered from inefficient training, where incorrect parameter settings can be computationally disastrous. This paper examines different evolutionary algorithms for training the Symbolic Radial Basis Function Neural Network (SRBFNN) by integrating the behavior of satisfiability logic programming. Inspired by evolutionary algorithms, which iteratively converge toward a near-optimal solution, different Evolutionary Algorithms (EAs) were designed to optimize the output weights of the SRBFNN corresponding to the embedded 2-Satisfiability logic programming representation (SRBFNN-2SAT). The SRBFNN objective function corresponding to Satisfiability logic programming can be minimized by several algorithms, including the Genetic Algorithm (GA), Evolution Strategy (ES), Differential Evolution (DE), and Evolutionary Programming (EP). Each method is presented as a flowchart, allowing straightforward implementation in any programming language. Using SRBFNN-2SAT, a training method based on these algorithms was developed, and the algorithms, implemented in Microsoft Visual C++, were compared across multiple performance metrics: Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Standard Deviation (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit time (CPU time). Based on the results, the EP algorithm achieved a higher training rate and a simpler structure than the other algorithms. It was confirmed that the EP algorithm is effective in training and in obtaining the best output weights with the smallest iteration error, thereby minimizing the objective function of SRBFNN-2SAT.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.